OpenLLM and free-llm-api-resources
These projects are complementary: OpenLLM provides the infrastructure to deploy and serve open-source LLMs as APIs, while the free resources list helps users discover which models and inference endpoints are available to deploy or integrate with such services.
About OpenLLM
bentoml/OpenLLM
Run any open-source LLM, such as DeepSeek or Llama, as an OpenAI-compatible API endpoint in the cloud.
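Because the served endpoint follows the OpenAI API shape, any standard client can talk to it. Below is a minimal sketch of building and posting a `/chat/completions` request with only the standard library; the base URL, port, and model name are assumptions that depend on your deployment, and the request only succeeds if a server is actually running there.

```python
import json
import urllib.request

# Assumed local server address; adjust for your own OpenLLM deployment.
BASE_URL = "http://localhost:3000/v1"


def build_chat_request(model: str, prompt: str) -> dict:
    """Construct a standard OpenAI-style /chat/completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def send_chat_request(payload: dict) -> dict:
    """POST the payload to the (assumed) endpoint; requires a running server."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# The model name below is illustrative, not a guaranteed identifier.
payload = build_chat_request("llama3.2-1b", "Hello!")
```

The same payload works against any OpenAI-compatible provider by swapping the base URL and adding an `Authorization` header where the service requires a key.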
About free-llm-api-resources
cheahjs/free-llm-api-resources
A list of free LLM inference resources accessible via API.
Curated directory of 25+ legitimate LLM API providers offering free tier or trial credits, with detailed rate limits, supported models, and verification requirements for each service. The README is automatically generated from a source script that tracks available models across providers like OpenRouter, Google AI Studio, Groq, and Mistral, enabling developers to quickly identify which free endpoints support specific model architectures and throughput needs. Covers both fully free services and credit-based trials, with explicit filtering to exclude reverse-engineered or unofficial APIs.
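The kind of lookup the directory enables ("which free endpoints support a given model at a given throughput") can be sketched as a simple filter. The provider records below are hypothetical placeholders, not actual rate limits or model lists from the repository.

```python
# Hypothetical records mirroring the fields the directory documents:
# provider name, free-tier rate limit (requests/min), and supported models.
providers = [
    {"name": "Groq", "rpm_limit": 30, "models": ["llama-3.1-8b", "mixtral-8x7b"]},
    {"name": "Google AI Studio", "rpm_limit": 15, "models": ["gemini-1.5-flash"]},
    {"name": "OpenRouter", "rpm_limit": 20, "models": ["llama-3.1-8b"]},
]


def providers_for(model: str, min_rpm: int = 0) -> list[str]:
    """List providers whose free tier offers a model at a minimum throughput."""
    return [
        p["name"]
        for p in providers
        if model in p["models"] and p["rpm_limit"] >= min_rpm
    ]


print(providers_for("llama-3.1-8b", min_rpm=25))  # → ['Groq']
```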