Helicone and LangWatch

Helicone and LangWatch are competitors with overlapping LLM observability capabilities (monitoring, tracing, and evaluation), though Helicone, an established YC-backed platform, has significantly greater adoption and maturity.

| Metric | helicone | langwatch |
| --- | --- | --- |
| Score | 81 (Verified) | 36 (Emerging) |
| Maintenance | 20/25 | 13/25 |
| Adoption | 16/25 | 2/25 |
| Maturity | 25/25 | 9/25 |
| Community | 20/25 | 12/25 |
| Stars | 5,237 | 2 |
| Forks | 494 | 1 |
| Downloads | 292 | — |
| Commits (30d) | 7 | 0 |
| Language | TypeScript | TypeScript |
| License | Apache-2.0 | — |
| Risk flags | None | No package, no dependents |

About Helicone

Helicone/helicone

🧊 Open source LLM observability platform. One line of code to monitor, evaluate, and experiment. YC W23 🍓

Operates as a reverse proxy AI gateway that intercepts requests to 100+ LLM providers through a unified OpenAI-compatible API, enabling intelligent routing and automatic fallbacks. Built on a microservices architecture with a Cloudflare Workers proxy layer for request interception, Express-based collection server (Jawn), ClickHouse for analytics, and Supabase for application data. Integrates with OpenAI, Anthropic, Gemini, LangChain, Vercel AI SDK, and supports self-hosting via Docker or Helm with optional async logging through OpenLLMetry.
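The "one line of code" integration works by pointing an existing OpenAI-compatible client at Helicone's gateway instead of the provider's endpoint. The sketch below illustrates that pattern; the gateway base URL and the `Helicone-Auth` header follow Helicone's documented convention, but treat the exact values as assumptions to verify against the current docs.

```typescript
// Minimal sketch of routing an OpenAI-style request through the Helicone
// reverse proxy: swap the base URL and add a Helicone auth header.
// Base URL and header name are assumptions based on Helicone's docs.
const HELICONE_BASE = "https://oai.helicone.ai/v1";

interface ProxiedRequest {
  url: string;
  headers: Record<string, string>;
}

function heliconeRequest(
  path: string,
  openaiKey: string,
  heliconeKey: string
): ProxiedRequest {
  return {
    url: `${HELICONE_BASE}${path}`,
    headers: {
      // The provider key still authenticates the upstream LLM call.
      Authorization: `Bearer ${openaiKey}`,
      // Helicone's own key lets the proxy attribute and log the request.
      "Helicone-Auth": `Bearer ${heliconeKey}`,
    },
  };
}

const req = heliconeRequest("/chat/completions", "sk-...", "sk-helicone-...");
console.log(req.url); // https://oai.helicone.ai/v1/chat/completions
```

Because the gateway speaks the OpenAI API shape, existing client code needs no other changes; monitoring, routing, and fallbacks happen transparently at the proxy layer.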

About LangWatch

tenemos/langwatch

The open LLM Ops platform: traces, analytics, evaluations, datasets, and prompt optimization ✨

Scores updated daily from GitHub, PyPI, and npm data.