OpenLLM and free-llm-api-resources

These are complements: OpenLLM provides the infrastructure to deploy and serve open-source LLMs behind OpenAI-compatible APIs, while free-llm-api-resources helps users discover which hosted models and free inference endpoints are available to integrate alongside such deployments.

                 OpenLLM            free-llm-api-resources
Overall score    74 (Verified)      54 (Established)
Maintenance      13/25              16/25
Adoption         18/25              10/25
Maturity         25/25               8/25
Community        18/25              20/25
Stars            12,161             15,475
Forks            803                1,538
Downloads        4,730              —
Commits (30d)    0                  4
Language         Python             Python
License          Apache-2.0         —
Risk flags       none               No License, No Package, No Dependents

About OpenLLM

bentoml/OpenLLM

Run any open-source LLMs, such as DeepSeek and Llama, as OpenAI-compatible API endpoints in the cloud.
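Because OpenLLM serves models behind an OpenAI-compatible API, a standard `/v1/chat/completions` payload works unchanged against a locally served model. A minimal sketch of that request shape follows; the base URL, port, and model name are illustrative assumptions, not values fixed by OpenLLM.

```python
import json

# Hypothetical local endpoint for a model started with `openllm serve`;
# the port and model name below are assumptions for illustration.
BASE_URL = "http://localhost:3000/v1"

# Standard OpenAI-style chat-completions payload; no OpenLLM-specific
# fields are needed because the API surface is OpenAI-compatible.
payload = {
    "model": "llama3.2",  # hypothetical: whatever model the server was started with
    "messages": [{"role": "user", "content": "Say hello."}],
}
body = json.dumps(payload)

# To send it, any OpenAI client library or a plain HTTP POST to
# BASE_URL + "/chat/completions" with this JSON body would work.
```

The practical consequence is that existing OpenAI client code can be pointed at a self-hosted OpenLLM deployment by changing only the base URL.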

About free-llm-api-resources

cheahjs/free-llm-api-resources

A list of free LLM inference resources accessible via API.

Curated directory of 25+ legitimate LLM API providers offering free tier or trial credits, with detailed rate limits, supported models, and verification requirements for each service. The README is automatically generated from a source script that tracks available models across providers like OpenRouter, Google AI Studio, Groq, and Mistral, enabling developers to quickly identify which free endpoints support specific model architectures and throughput needs. Covers both fully free services and credit-based trials, with explicit filtering to exclude reverse-engineered or unofficial APIs.
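Since the list tracks which free endpoints support specific models, a common companion task is filtering a provider's live model index for free-tier entries. The sketch below assumes OpenRouter's response shape (`{"data": [{"id": ...}]}`) and its convention of a `:free` suffix on free-tier model ids; treat both as assumptions for other providers.

```python
def free_models(models_response: dict) -> list[str]:
    """Return ids of models whose listing marks them as free-tier.

    Assumes an OpenRouter-style model index: a "data" list of objects,
    each with an "id" field, where free-tier variants end in ":free".
    """
    return [m["id"] for m in models_response.get("data", [])
            if m["id"].endswith(":free")]

# Hand-written sample in the assumed response shape (not live API data).
sample = {"data": [
    {"id": "meta-llama/llama-3.1-8b-instruct:free"},
    {"id": "openai/gpt-4o"},
]}
```

In practice the input would come from a GET request to the provider's model-listing endpoint rather than a hard-coded sample.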

Scores updated daily from GitHub, PyPI, and npm data.