gateway and coai

Both are unified LLM gateway platforms that compete on the same core value proposition—routing requests across 200+ models with guardrails and provider abstraction—making them direct competitors rather than complementary tools.

                 gateway             coai
Score            77 (Verified)       64 (Established)
Maintenance      16/25               16/25
Adoption         17/25               10/25
Maturity         25/25               16/25
Community        19/25               22/25
Stars            10,885              9,008
Forks            936                 1,188
Downloads        968                 —
Commits (30d)    3                   1
Language         TypeScript          TypeScript
License          MIT                 Apache-2.0
Risk flags       None                No Package, No Dependents

About gateway

Portkey-AI/gateway

A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.

Supports multi-modal requests (vision, audio, image models), automatic retries with fallback routing, load balancing, and conditional request routing. Built as a lightweight Node.js application (<1ms latency, 122kb footprint) compatible with OpenAI SDKs and frameworks like LangChain, LlamaIndex, and CrewAI. Includes declarative config-based guardrails for output validation and offers enterprise deployments across AWS, Azure, GCP, and Kubernetes environments.
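The fallback routing and declarative config described above can be sketched as follows. This is a minimal illustration, not the library's documented API: the header name (`x-portkey-config`), local port, placeholder API keys, and model names are assumptions modeled on common gateway setups, so check the repo's docs before relying on them.

```typescript
// Sketch of a fallback routing config for a locally run gateway.
// All endpoint details and key names below are assumptions for illustration.

interface Target {
  provider: string;
  api_key: string; // placeholder; real keys would come from env vars
  override_params?: { model: string };
}

interface GatewayConfig {
  strategy: { mode: "fallback" | "loadbalance" };
  targets: Target[];
}

export function buildFallbackConfig(): GatewayConfig {
  return {
    // "fallback": try targets in order until one succeeds
    strategy: { mode: "fallback" },
    targets: [
      {
        provider: "openai",
        api_key: "OPENAI_KEY",
        override_params: { model: "gpt-4o-mini" },
      },
      {
        provider: "anthropic",
        api_key: "ANTHROPIC_KEY",
        override_params: { model: "claude-3-haiku-20240307" },
      },
    ],
  };
}

// Usage sketch (assumes a gateway listening locally):
// await fetch("http://localhost:8787/v1/chat/completions", {
//   method: "POST",
//   headers: {
//     "content-type": "application/json",
//     "x-portkey-config": JSON.stringify(buildFallbackConfig()),
//   },
//   body: JSON.stringify({ messages: [{ role: "user", content: "hi" }] }),
// });
```

Because the config is plain JSON passed per request, routing policy can change without redeploying the gateway itself.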

About coai

coaidev/coai

🚀 Next Generation Multi-tenant AI One-Stop Solution. Built-in Admin & Billing System. Enterprise-Grade Unified LLM Gateway with Support for 200+ Models and 35+ Providers, Load Balancing w/ Priority-based Routing, Cost Management, Chat Share, Cloud Sync, Credit/Subscription Billing, All File Parsing, Web Search, Built-in Model Cache.

Implements a decoupled microservices architecture with a separate blob service for file parsing (supporting S3/R2/MinIO backends and OCR), integrates SearXNG for multi-engine web search, and provides OpenAI API-compatible endpoints enabling proxying of 200+ models through a single deployment. Built with Next.js frontend (Shadcn UI + Tremor) and supports PWA/Tauri desktop clients, with zero-cost cross-device sync via database-backed conversation persistence rather than external storage solutions.
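Since coai exposes OpenAI API-compatible endpoints, any OpenAI-style client can proxy models through one deployment. The sketch below assumes a self-hosted instance; the base URL, API key, and model name are placeholders, not values from coai's documentation.

```typescript
// Sketch: building an OpenAI-style chat-completions request for a
// self-hosted coai deployment. URL and model are assumed placeholders.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

export function buildChatRequest(model: string, messages: ChatMessage[]) {
  // Standard OpenAI chat-completions payload; the gateway proxies it to
  // whichever upstream provider actually serves `model`.
  return {
    url: "https://your-coai-host/v1/chat/completions", // assumed host
    body: { model, messages, stream: false },
  };
}

// Usage sketch:
// const { url, body } = buildChatRequest("gpt-4o-mini", [
//   { role: "user", content: "hi" },
// ]);
// await fetch(url, {
//   method: "POST",
//   headers: {
//     "content-type": "application/json",
//     authorization: "Bearer <coai-api-key>",
//   },
//   body: JSON.stringify(body),
// });
```

Swapping providers is then just a matter of changing the `model` string, since the request shape stays the same across all 200+ proxied models.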

Scores updated daily from GitHub, PyPI, and npm data.