ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
Provides a unified local inference runtime built on llama.cpp with REST API and language bindings (Python, JavaScript) for seamless integration into applications and agents. Features a model library with automatic download/management, streaming responses, and launcher integrations for Claude, Codex, and other external tools. Ecosystem spans 100+ community projects including web UIs (Open WebUI, LibreChat), desktop clients (AnythingLLM, Cherry Studio), and IDE extensions (Continue, Cline).
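As a sketch of the REST API mentioned above: Ollama's documented `/api/generate` endpoint accepts a JSON payload and, when `"stream": true`, returns newline-delimited JSON chunks whose `"response"` fields concatenate into the full completion. The snippet below builds such a payload and parses sample chunks; the chunks are hard-coded so it runs without a live server, and the model tag `gemma3` is illustrative (any locally pulled model works).

```python
import json

# Payload for Ollama's /api/generate endpoint
# (POST http://localhost:11434/api/generate in a default install).
payload = {
    "model": "gemma3",                 # illustrative model tag
    "prompt": "Why is the sky blue?",
    "stream": True,                    # ask for newline-delimited JSON chunks
}

# Streaming responses arrive as one JSON object per line; the final
# chunk sets "done": true. These sample lines stand in for a server.
sample_stream = [
    '{"model":"gemma3","response":"Because","done":false}',
    '{"model":"gemma3","response":" of Rayleigh scattering.","done":true}',
]

def collect(lines):
    """Concatenate the "response" fields of streamed chunks into the full text."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

print(collect(sample_stream))  # Because of Rayleigh scattering.
```

The same chunk format is what the official Python and JavaScript bindings parse for you under the hood.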
164,987 stars. Actively maintained with 118 commits in the last 30 days.
Stars
164,987
Forks
14,940
Language
Go
License
MIT
Category
Last pushed
Mar 13, 2026
Commits (30d)
118
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/ollama/ollama"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Related tools
jd-opensource/JDOxyGent4J
JDOxyGent4J: The Java sibling of the OxyGent ecosystem.
dext7r/ollama-api-pool
Intelligent Ollama API proxy pool based on Cloudflare Workers
sammcj/gollama
Go manage your Ollama models
nandlabs/golly
golly is an open-source library for Go
ollama4j/ollama4j-web-ui
Web UI for Ollama built in Java with Vaadin, Spring Boot and Ollama4j