lm-proxy and LLM-API-Key-Proxy
These projects are competitors with overlapping core functionality: both provide OpenAI-compatible HTTP gateways for multi-provider LLM inference. lm-proxy emphasizes lightweight library usage, while LLM-API-Key-Proxy adds intelligent load-balancing features.
Scores (out of 25)

                 lm-proxy    LLM-API-Key-Proxy
Maintenance      10/25       10/25
Adoption         16/25       10/25
Maturity         24/25       15/25
Community        12/25       22/25
                 lm-proxy    LLM-API-Key-Proxy
Stars            92          418
Forks            10          76
Downloads        1,044       —
Commits (30d)    0           0
Language         Python      Python
License          MIT         —
Risk flags
lm-proxy: none
LLM-API-Key-Proxy: No Package, No Dependents
About lm-proxy
Nayjest/lm-proxy
OpenAI-compatible HTTP LLM proxy / gateway for multi-provider inference (Google, Anthropic, OpenAI, PyTorch). Lightweight, extensible Python/FastAPI—use as library or standalone service.
About LLM-API-Key-Proxy
Mirrowel/LLM-API-Key-Proxy
Universal LLM Gateway: One API, every LLM. OpenAI/Anthropic-compatible endpoints with multi-provider translation and intelligent load-balancing.
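Because both projects expose OpenAI-compatible endpoints, a client targets either one the same way: override the base URL and send a standard chat-completions payload. Below is a minimal stdlib sketch of building such a request; the localhost address, API key, and model name are placeholder assumptions, not values documented by either project.

```python
import json
from urllib import request

# Placeholder values -- a real deployment defines its own host, port,
# API key, and model identifiers.
PROXY_BASE_URL = "http://localhost:8000/v1"  # assumed local proxy address
API_KEY = "sk-example"                       # assumed key the proxy accepts

def build_chat_request(model: str, user_message: str) -> request.Request:
    """Build an OpenAI-style /chat/completions request aimed at the proxy."""
    payload = {
        "model": model,  # the gateway routes this to the matching provider
        "messages": [{"role": "user", "content": user_message}],
    }
    return request.Request(
        url=f"{PROXY_BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o-mini", "Hello!")
# Sending it requires a running proxy:
#   with request.urlopen(req) as resp:
#       reply = json.load(resp)
```

Because the wire format is the stock OpenAI schema, the same request works unchanged against either gateway; only the base URL and credentials differ.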
Scores updated daily from GitHub, PyPI, and npm data.