codex-odyssey/llm-observability
Sample application used in 『俺たちと探究するLLM Observability アプリケーションのオブザーバビリティ』 ("Exploring the Observability of LLM Applications with Us"), published at Techbookfest #17 (技術書典#17)
This project helps developers understand how their Large Language Model (LLM) applications are performing. Through an example environment, it demonstrates how to monitor the inputs, outputs, and internal workings of an LLM application. Developers building and testing LLM-powered tools can use it to gain insight into their application's behavior.
No commits in the last 6 months.
Use this if you are a developer learning how to implement observability for your LLM applications.
Not ideal if you are a non-developer seeking to understand LLM performance without delving into application code or monitoring tools.
Stars
18
Forks
—
Language
Jupyter Notebook
License
—
Category
—
Last pushed
Nov 08, 2024
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/codex-odyssey/llm-observability"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
Arize-ai/openinference
OpenTelemetry Instrumentation for AI Observability
vndee/llm-sandbox
Lightweight and portable LLM sandbox runtime (code interpreter) Python library.
apache/hertzbeat
An AI-powered next-generation open source real-time observability system.
traceloop/openllmetry
Open-source observability for your GenAI or LLM application, based on OpenTelemetry
utkuozdemir/nvidia_gpu_exporter
Nvidia GPU exporter for prometheus using nvidia-smi binary