openinference and openllmetry

These tools are complementary: OpenInference provides a standardized set of OpenTelemetry semantic conventions for AI observability, which OpenLLMetry implements to offer comprehensive observability for GenAI and LLM applications.

openinference — Score: 73 (Verified)
  Maintenance 22/25 · Adoption 10/25 · Maturity 16/25 · Community 25/25
  Stars: 886 · Forks: 200 · Commits (30d): 61
  Language: Python · License: Apache-2.0
  Downloads: — (no package detected, no dependents)

openllmetry — Score: 68 (Established)
  Maintenance 20/25 · Adoption 10/25 · Maturity 16/25 · Community 22/25
  Stars: 6,906 · Forks: 900 · Commits (30d): 33
  Language: Python · License: Apache-2.0
  Downloads: — (no package detected, no dependents)

About openinference

Arize-ai/openinference

OpenTelemetry Instrumentation for AI Observability

This project helps machine learning engineers and AI developers understand the internal workings and performance of their AI applications, especially those using Large Language Models (LLMs). It provides a way to trace the steps and data flow within an AI application, from inputs like user queries to outputs generated by LLMs or external tools. The output is detailed observability data that helps debug, optimize, and monitor AI systems.

Tags: AI observability, LLM development, MLOps, AI application monitoring, model debugging
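To make the tracing idea concrete, here is a minimal stdlib-only sketch of the kind of span a library following OpenInference conventions would emit for one LLM call. The attribute keys (`openinference.span.kind`, `llm.model_name`, `input.value`, `output.value`) are real convention names; the plain dict standing in for an OpenTelemetry span is an illustration, not the library's API.

```python
import json

def llm_span(model: str, prompt: str, completion: str) -> dict:
    """Build a dict of OpenInference-style attributes for one LLM call.

    In a real setup, an instrumentor would attach these attributes to an
    OpenTelemetry span; here a dict stands in for the span object.
    """
    return {
        "openinference.span.kind": "LLM",   # other kinds: CHAIN, TOOL, RETRIEVER
        "llm.model_name": model,
        "input.value": prompt,
        "output.value": completion,
    }

span = llm_span("gpt-4o", "What is OpenTelemetry?", "An observability framework.")
print(json.dumps(span, indent=2))
```

A backend such as a trace viewer would then group these spans by trace to reconstruct the data flow from user query to model output.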

About openllmetry

traceloop/openllmetry

Open-source observability for your GenAI or LLM application, based on OpenTelemetry

This project helps developers instrument and monitor their Generative AI applications to understand how they are performing. It takes in data about your application's interactions with LLMs and vector databases, and outputs detailed traces and metrics. This allows software engineers building AI-powered products to debug issues, optimize performance, and ensure reliability.

Tags: AI application development, LLM operations, software observability, application monitoring, debugging
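The instrument-and-emit pattern described above can be sketched with a hypothetical decorator that mimics how OpenLLMetry wraps LLM and vector-database calls: capture inputs, outputs, and latency, then export a span record. The `traced` decorator and `TRACES` list below are illustrative stand-ins, not the Traceloop SDK's actual API.

```python
import functools
import time

TRACES = []  # stand-in for an OpenTelemetry exporter backend

def traced(span_name: str):
    """Hypothetical decorator illustrating auto-instrumentation:
    record a span with inputs, output, and latency for each call."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            start = time.perf_counter()
            result = fn(*args, **kwargs)
            TRACES.append({
                "name": span_name,
                "input": {"args": args, "kwargs": kwargs},
                "output": result,
                "latency_s": time.perf_counter() - start,
            })
            return result
        return inner
    return wrap

@traced("fake_llm.completion")
def fake_llm(prompt: str) -> str:
    # Stand-in for a real model call.
    return prompt.upper()

print(fake_llm("hello"))
print(TRACES[0]["name"])
```

The appeal of this pattern is that application code keeps calling its LLM client as usual while every interaction is recorded for later debugging and performance analysis.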

Scores updated daily from GitHub, PyPI, and npm data. How scores work