one-api and fastapi-web
These are competitors offering overlapping LLM API aggregation and management functionality. Both provide unified API adapters across multiple LLM providers with Docker deployment, so users would typically select one based on feature completeness and ecosystem maturity rather than running both together.
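The core idea behind both tools is request adaptation: a client speaks the OpenAI chat-completions format, and the gateway rewrites the request for whichever upstream provider a channel points at. As a minimal sketch (illustrative only, not code from either project), here is what translating an OpenAI-style request into an Anthropic Messages-style payload looks like; the field names follow Anthropic's public API, and the helper name `to_anthropic` is hypothetical:

```python
def to_anthropic(openai_req: dict) -> dict:
    """Illustrative adapter: OpenAI chat format -> Anthropic Messages format.

    Anthropic takes the system prompt as a top-level `system` field
    (not as a message role) and requires `max_tokens`.
    """
    # Collect any system-role messages into a single system string.
    system = " ".join(
        m["content"] for m in openai_req["messages"] if m["role"] == "system"
    )
    # Everything else passes through as alternating user/assistant turns.
    msgs = [m for m in openai_req["messages"] if m["role"] != "system"]

    payload = {
        "model": openai_req["model"],
        "max_tokens": openai_req.get("max_tokens", 1024),  # assumed default
        "messages": msgs,
    }
    if system:
        payload["system"] = system
    return payload
```

A gateway maintains one such adapter per provider family, which is why clients only ever need an OpenAI-compatible SDK regardless of which model ultimately serves the request.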
About one-api
songquanpeng/one-api
LLM API management & distribution system supporting mainstream models including OpenAI, Azure, Anthropic Claude, Google Gemini, DeepSeek, ByteDance Doubao (字节豆包), ChatGLM, Baidu ERNIE (文心一言), iFlytek Spark (讯飞星火), Alibaba Tongyi Qianwen (通义千问), 360 Zhinao (360 智脑), and Tencent Hunyuan (腾讯混元), with unified API adaptation; usable for key management and secondary redistribution. Ships as a single executable with a Docker image for one-click, out-of-the-box deployment, and includes an English UI.
one-api implements request routing with automatic failover across channels, supporting load balancing and token-level quotas with granular controls (expiration, IP restrictions, model whitelist). Built in Go with a React frontend, it exposes a complete management API for programmatic integration and extensibility without code modification.
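The channel-selection-with-failover pattern described above can be sketched as follows. This is a hypothetical Python illustration of the general technique (one-api itself is written in Go, and its actual selection logic may differ): pick a channel by weighted random choice among enabled channels, and on error retry with the failed channel excluded until one succeeds or the pool is exhausted.

```python
import random


def pick_channel(channels: list[dict], exclude: set[str] = frozenset()):
    """Weighted random pick among enabled channels not yet excluded."""
    pool = [c for c in channels if c["enabled"] and c["name"] not in exclude]
    if not pool:
        return None
    # Standard weighted sampling: walk the pool subtracting weights.
    r = random.uniform(0, sum(c["weight"] for c in pool))
    for c in pool:
        r -= c["weight"]
        if r <= 0:
            return c
    return pool[-1]  # guard against floating-point rounding


def dispatch(channels: list[dict], call):
    """Automatic failover: try channels until one call succeeds."""
    failed: set[str] = set()
    while True:
        ch = pick_channel(channels, failed)
        if ch is None:
            raise RuntimeError("all channels exhausted")
        try:
            return call(ch)  # forward the request to this upstream
        except Exception:
            failed.add(ch["name"])  # exclude and retry another channel
```

With weights acting as relative traffic shares, this one loop gives both load balancing (the weighted pick) and failover (the retry with exclusion).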
About fastapi-web
iimeta/fastapi-web
Enterprise-grade LLM API rapid-integration system supporting OpenAI, Azure, Baidu ERNIE (文心一言), iFlytek Spark (讯飞星火), Alibaba Tongyi Qianwen (通义千问), Zhipu GLM (智谱GLM), Gemini, DeepSeek, Anthropic Claude, and any model exposing an OpenAI-format API. Clean page design; lightweight, efficient, and stable; supports one-click Docker deployment.