Helicone and LangWatch
Helicone and LangWatch are competitors offering overlapping LLM observability capabilities (monitoring, tracing, and evaluation), though Helicone, an established YC-backed platform, has achieved significantly greater adoption and maturity.
About Helicone
Helicone/helicone
🧊 Open source LLM observability platform. One line of code to monitor, evaluate, and experiment. YC W23 🍓
Operates as a reverse proxy AI gateway that intercepts requests to 100+ LLM providers through a unified OpenAI-compatible API, enabling intelligent routing and automatic fallbacks. Built on a microservices architecture with a Cloudflare Workers proxy layer for request interception, Express-based collection server (Jawn), ClickHouse for analytics, and Supabase for application data. Integrates with OpenAI, Anthropic, Gemini, LangChain, Vercel AI SDK, and supports self-hosting via Docker or Helm with optional async logging through OpenLLMetry.
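The reverse-proxy pattern described above can be sketched in a few lines: the client keeps the familiar OpenAI-shaped request but points it at the gateway endpoint, which logs the call and forwards it to the upstream provider. This is a minimal stdlib-only sketch; the gateway URL and the `Helicone-Auth` header name below are illustrative assumptions, not confirmed API details — check Helicone's documentation for the exact values.

```python
import json
import urllib.request

# Assumed gateway endpoint for illustration; see Helicone's docs for the real one.
GATEWAY_URL = "https://oai.helicone.ai/v1/chat/completions"

def build_gateway_request(provider_key: str, helicone_key: str,
                          payload: dict) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible request routed via the gateway.

    The gateway intercepts the call for logging/analytics, then forwards it
    to the upstream provider using the provider key.
    """
    return urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {provider_key}",   # upstream provider key
            "Helicone-Auth": f"Bearer {helicone_key}",   # assumed gateway auth header
        },
        method="POST",
    )

# Same request body a client would send to the provider directly —
# only the URL and the extra auth header change.
req = build_gateway_request(
    "sk-...", "hl-...",
    {"model": "gpt-4o-mini",
     "messages": [{"role": "user", "content": "hi"}]},
)
```

Because the gateway speaks the same OpenAI-compatible wire format, existing SDKs can typically be redirected by overriding their base URL rather than hand-building requests as above.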
About LangWatch
langwatch/langwatch
The open LLM Ops platform - Traces, Analytics, Evaluations, Datasets and Prompt Optimization ✨