Helicone and OpenInspector

These projects are competitors: both offer open-source LLM observability platforms for monitoring, logging, and tracing LLM interactions.

helicone — score 81 (Verified)
  Maintenance: 20/25 · Adoption: 16/25 · Maturity: 25/25 · Community: 20/25
  Stars: 5,237 · Forks: 494 · Downloads: 292 · Commits (30d): 7
  Language: TypeScript · License: Apache-2.0
  No risk flags

openinspector — score 22 (Experimental)
  Maintenance: 13/25 · Adoption: 0/25 · Maturity: 9/25 · Community: 0/25
  Stars: — · Forks: — · Downloads: — · Commits (30d): 0
  Language: TypeScript · License: Apache-2.0
  No package published · No dependents

About helicone

Helicone/helicone

🧊 Open source LLM observability platform. One line of code to monitor, evaluate, and experiment. YC W23 🍓

Operates as a reverse proxy AI gateway that intercepts requests to 100+ LLM providers through a unified OpenAI-compatible API, enabling intelligent routing and automatic fallbacks. Built on a microservices architecture with a Cloudflare Workers proxy layer for request interception, Express-based collection server (Jawn), ClickHouse for analytics, and Supabase for application data. Integrates with OpenAI, Anthropic, Gemini, LangChain, Vercel AI SDK, and supports self-hosting via Docker or Helm with optional async logging through OpenLLMetry.
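Because the gateway is a drop-in reverse proxy behind an OpenAI-compatible API, integration reduces to pointing an existing client at Helicone's endpoint and attaching an auth header. A minimal sketch of that configuration, assuming the documented `oai.helicone.ai` gateway URL and `Helicone-Auth` header (the `heliconeConfig` helper name is illustrative, not part of Helicone's SDK):

```typescript
// Sketch: build the client options that redirect OpenAI-style traffic
// through the Helicone gateway. The base URL and header name follow
// Helicone's documented OpenAI integration; the helper itself is ours.
interface GatewayConfig {
  baseURL: string;
  defaultHeaders: Record<string, string>;
}

function heliconeConfig(heliconeApiKey: string): GatewayConfig {
  return {
    baseURL: "https://oai.helicone.ai/v1",
    defaultHeaders: { "Helicone-Auth": `Bearer ${heliconeApiKey}` },
  };
}

// Usage with the official openai SDK (assumed installed):
//   const client = new OpenAI({ apiKey: OPENAI_KEY, ...heliconeConfig(HELICONE_KEY) });
```

Everything else in the application stays unchanged, which is what the "one line of code" pitch refers to.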

About openinspector

as32608/openinspector

A lightweight, local-first observability proxy and dashboard designed to intercept, log, and trace LLM interactions. OpenInspector acts as a transparent middleman, offering full visibility into agentic workflows, tool executions, and latency metrics without requiring you to change a single line of your application code.
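The "transparent middleman" idea boils down to sitting between the app and the upstream LLM API, forwarding each request unchanged while recording metadata such as status and latency. A minimal sketch of that interception pattern (this is an illustration of the technique, not OpenInspector's actual code; the `traced` and `TraceRecord` names are hypothetical):

```typescript
// Sketch: wrap an upstream call so a trace record is emitted per request,
// without the caller changing how it issues requests.
type TraceRecord = { url: string; status: number; latencyMs: number };

async function traced(
  fetchImpl: typeof fetch,
  log: (r: TraceRecord) => void,
  url: string,
  init?: RequestInit,
): Promise<Response> {
  const start = Date.now();
  const res = await fetchImpl(url, init); // forward unchanged to the upstream API
  log({ url, status: res.status, latencyMs: Date.now() - start });
  return res;
}
```

A real proxy does this at the network layer (the app just swaps its base URL to `localhost`), which is how the tool avoids requiring application-code changes.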

Scores updated daily from GitHub, PyPI, and npm data.