The LLM Inference Directory
Quality-scored directory of LLM inference engines, updated daily. Every engine is scored on maintenance, adoption, maturity, and community signals.
LLM inference engines for self-hosted model serving — vLLM, TGI, SGLang, Ollama, llama.cpp, and the tools that make local inference fast and efficient.
Score tiers:

| Tier | Score range |
|--------------|-------------|
| Verified | 70–100 |
| Established | 50–69 |
| Emerging | 30–49 |
| Experimental | 10–29 |
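As a minimal sketch, the tier bands above can be expressed as a simple score-to-tier mapping. This is a hypothetical illustration of the published cutoffs, not the directory's actual scoring code; the function name and the handling of scores below 10 are assumptions.

```python
def tier(score: int) -> str:
    """Map a 10-100 quality score to its directory tier (hypothetical sketch)."""
    if score >= 70:
        return "Verified"
    if score >= 50:
        return "Established"
    if score >= 30:
        return "Emerging"
    if score >= 10:
        return "Experimental"
    return "Unscored"  # below the lowest published band; assumption
```

For example, `tier(72)` falls in the Verified band and `tier(45)` in Emerging.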
Top engines by quality score
| # | Engine | Score |
|---|--------|-------|