BerriAI/litellm

Python SDK and Proxy Server (AI Gateway) to call 100+ LLM APIs in the OpenAI (or native) format, with cost tracking, guardrails, load balancing, and logging. [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, VLLM, NVIDIA NIM]

Quality score: 98 / 100 (Verified)

Supports A2A agent protocols (LangGraph, Vertex AI, Bedrock, Pydantic AI) and MCP tool servers, allowing seamless integration of agentic workflows and tool ecosystems into any LLM application. Implements a unified request/response format across all providers, automatically translating native API schemas to OpenAI-compatible endpoints for interoperability. Available as both a Python SDK and a stateless proxy server that can be deployed independently or containerized as a central gateway for multi-tenant LLM access.
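As a sketch of what that unified format looks like (assuming the standard OpenAI chat-completions schema; the model strings below are illustrative examples, and no network call is made):

```python
# Sketch of the unified request shape LiteLLM uses across providers.
# In the real SDK this payload would be passed to litellm.completion();
# here we only construct the dict, since an actual call needs provider keys.

def build_request(model: str, prompt: str) -> dict:
    """Return an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# The same payload shape works for every provider; only the model string changes.
requests = [
    build_request(m, "Hello!")
    for m in ("gpt-4o", "anthropic/claude-3-haiku", "bedrock/amazon.titan-text-lite-v1")
]
print(all(r["messages"][0]["role"] == "user" for r in requests))  # → True
```

Because only the `model` field varies, callers can switch providers without touching the rest of the request or the response-handling code.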

38,910 stars and 95,199,449 monthly downloads. Used by 162 other packages. Actively maintained with 2,000 commits in the last 30 days. Available on PyPI.

Maintenance 25 / 25
Adoption 25 / 25
Maturity 25 / 25
Community 23 / 25


Stars: 38,910
Forks: 6,381
Language: Python
License:
Category: llm-api-gateways
Last pushed: Mar 13, 2026
Monthly downloads: 95,199,449
Commits (30d): 2,000
Dependencies: 12
Reverse dependents: 162

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/BerriAI/litellm"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.