gateway vs. lm-proxy

These are competitors in the LLM gateway space: both provide multi-provider routing behind an OpenAI-compatible API. Portkey-AI's gateway operates at significantly larger scale (routing to 200+ LLMs with 50+ guardrails), while lm-proxy emphasizes lightweight extensibility as a Python/FastAPI library.

|                | gateway         | lm-proxy         |
|----------------|-----------------|------------------|
| Score          | 77 (Verified)   | 62 (Established) |
| Maintenance    | 16/25           | 10/25            |
| Adoption       | 17/25           | 16/25            |
| Maturity       | 25/25           | 24/25            |
| Community      | 19/25           | 12/25            |
| Stars          | 10,885          | 92               |
| Forks          | 936             | 10               |
| Downloads      | 968             | 1,044            |
| Commits (30d)  | 3               | 0                |
| Language       | TypeScript      | Python           |
| License        | MIT             | MIT              |
| Risk flags     | None            | None             |

About gateway

Portkey-AI/gateway

A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.

Supports multi-modal requests (vision, audio, image models), automatic retries with fallback routing, load balancing, and conditional request routing. Built as a lightweight Node.js application (<1ms latency, 122kb footprint) compatible with OpenAI SDKs and frameworks like LangChain, LlamaIndex, and CrewAI. Includes declarative config-based guardrails for output validation and offers enterprise deployments across AWS, Azure, GCP, and Kubernetes environments.
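The config-based routing means fallback behavior is declared as data and attached to an otherwise ordinary OpenAI-style request. A minimal sketch of what such a pairing looks like; the `strategy`/`targets` shape and the `x-portkey-config` header follow Portkey's documented pattern, but treat the exact field names here as assumptions for illustration:

```python
import json

# Hypothetical fallback config: try OpenAI first, fall back to Anthropic.
# Field names mirror Portkey's config-based routing but are not verified
# against the current API schema.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"provider": "openai", "api_key": "sk-openai-..."},
        {"provider": "anthropic", "api_key": "sk-ant-..."},
    ],
}

# An OpenAI-style chat payload the gateway forwards to whichever
# target ends up serving the request.
payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}

# The config travels alongside the request, e.g. as a header (assumed name).
headers = {
    "Content-Type": "application/json",
    "x-portkey-config": json.dumps(config),
}

print(headers["x-portkey-config"])
```

Because the routing policy lives in the config rather than in application code, swapping fallback for load balancing is a one-line change to `strategy.mode` rather than a code change.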

About lm-proxy

Nayjest/lm-proxy

OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI—use as library or standalone service.
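Since the proxy speaks the OpenAI wire format, any OpenAI-compatible client can target it simply by pointing its base URL at the proxy. A minimal stdlib sketch that builds (but does not send) a chat-completion request against a locally running instance; the port, path, and key handling are assumptions:

```python
import json
import urllib.request

# Assumed local endpoint for a standalone lm-proxy instance; adjust to
# wherever the FastAPI app is actually served.
BASE_URL = "http://localhost:8000"

body = {
    "model": "gpt-4o-mini",  # provider/model routing is the proxy's job
    "messages": [{"role": "user", "content": "Ping"}],
}

req = urllib.request.Request(
    f"{BASE_URL}/v1/chat/completions",  # standard OpenAI-compatible path
    data=json.dumps(body).encode(),
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer dummy-key",  # key handling depends on proxy config
    },
    method="POST",
)

# urllib.request.urlopen(req) would dispatch it; omitted so the sketch
# stays runnable without a live proxy.
print(req.full_url, req.get_method())
```

The same switch-the-base-URL approach works with the official OpenAI SDK (`OpenAI(base_url=...)`), which is what makes a drop-in proxy like this useful without touching calling code.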

Scores updated daily from GitHub, PyPI, and npm data.