gateway and lm-proxy
Both projects compete in the LLM gateway space, providing multi-provider routing behind an OpenAI-compatible API. Portkey-AI/gateway offers considerably broader coverage (200+ LLMs, 50+ guardrails), while lm-proxy emphasizes lightweight extensibility as a Python/FastAPI library.
About gateway
Portkey-AI/gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.
Supports multi-modal requests (vision, audio, image models), automatic retries with fallback routing, load balancing, and conditional request routing. Built as a lightweight Node.js application (<1ms latency, 122kb footprint) compatible with OpenAI SDKs and frameworks like LangChain, LlamaIndex, and CrewAI. Includes declarative config-based guardrails for output validation and offers enterprise deployments across AWS, Azure, GCP, and Kubernetes environments.
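The fallback routing and load balancing described above are driven by declarative configs. A minimal sketch of such a config, based on Portkey's documented config style (the `virtual_key` values are placeholders, and exact field names may vary by gateway version):

```json
{
  "strategy": { "mode": "fallback" },
  "targets": [
    { "virtual_key": "openai-prod" },
    { "virtual_key": "anthropic-backup" }
  ]
}
```

With a config like this attached to a request, the gateway tries the first target and automatically retries against the next one on failure, without any change to client code.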
About lm-proxy
Nayjest/lm-proxy
OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI—use as library or standalone service.
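Because both projects expose OpenAI-compatible endpoints, an existing client typically needs only a changed base URL. A minimal sketch of the request body such a proxy accepts on its chat-completions route (the model name and message contents are illustrative, not taken from either project's docs):

```python
import json

# An OpenAI-compatible chat completion request body -- the wire format a
# gateway/proxy like the ones above accepts, then routes to a real provider.
# Model name and messages are illustrative assumptions.
payload = {
    "model": "gpt-4o-mini",  # the proxy maps this to a configured provider
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
}

# Serialize as it would be sent in the HTTP POST body.
body = json.dumps(payload)
print(body)
```

Pointing an unmodified OpenAI SDK at the proxy's base URL produces exactly this shape, which is why both gateways can sit transparently in front of existing applications.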