ollama and gollama

Metric          ollama           gollama
Score           71 (Verified)    42 (Emerging)
Maintenance     25/25            6/25
Adoption        10/25            10/25
Maturity        16/25            9/25
Community       20/25            17/25
Stars           164,987          1,706
Forks           14,940           103
Downloads
Commits (30d)   118              0
Language        Go               Go
License         MIT              MIT
Package         none             none
Dependents      none             none

About ollama

ollama/ollama

Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.

Provides a unified local inference runtime built on llama.cpp with REST API and language bindings (Python, JavaScript) for seamless integration into applications and agents. Features a model library with automatic download/management, streaming responses, and launcher integrations for Claude, Codex, and other external tools. Ecosystem spans 100+ community projects including web UIs (Open WebUI, LibreChat), desktop clients (AnythingLLM, Cherry Studio), and IDE extensions (Continue, Cline).
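The streaming responses mentioned above arrive from Ollama's REST API as newline-delimited JSON: each line of the `/api/generate` response body carries a `response` text fragment plus a `done` flag on the final record. A minimal sketch of consuming that stream, assuming those documented field names; the sample stream here is illustrative stand-in data, not output from a live server (a real call would be an HTTP POST to `http://localhost:11434/api/generate`):

```go
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"strings"
)

// chunk mirrors the shape of one line of Ollama's streaming
// /api/generate response: a text fragment plus a done flag.
type chunk struct {
	Response string `json:"response"`
	Done     bool   `json:"done"`
}

// collect concatenates the "response" fragments from a stream of
// newline-delimited JSON, stopping at the record with "done": true.
func collect(stream string) (string, error) {
	var out strings.Builder
	sc := bufio.NewScanner(strings.NewReader(stream))
	for sc.Scan() {
		var c chunk
		if err := json.Unmarshal(sc.Bytes(), &c); err != nil {
			return "", err
		}
		out.WriteString(c.Response)
		if c.Done {
			break
		}
	}
	return out.String(), sc.Err()
}

func main() {
	// Illustrative stand-in for a streaming response body.
	stream := `{"response":"Hello","done":false}
{"response":", world","done":false}
{"response":"!","done":true}`
	text, err := collect(stream)
	if err != nil {
		panic(err)
	}
	fmt.Println(text)
}
```

The same line-by-line decode works whether the body comes from `net/http` or a test fixture, which is why a plain `bufio.Scanner` over the response body is the idiomatic way to consume this API from Go.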

About gollama

sammcj/gollama

Go manage your Ollama models

A TUI-based model manager for Ollama that goes beyond basic operations with built-in vRAM estimation across quantization levels, Modelfile editing, and advanced sorting/filtering by metadata (size, quantization, family, parameters). The tool communicates directly with the Ollama API and supports both local and remote instances via configurable host endpoints, while also providing CLI modes for scripting and batch operations like search with boolean operators.
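To make the vRAM-estimation feature concrete, here is a back-of-envelope sketch of the kind of calculation involved; this is not gollama's actual estimator. It counts only weight storage (parameter count times bits per weight) and ignores the KV cache and runtime overhead, which add more memory in practice; the bits-per-weight figures for each quantization level are approximate assumptions:

```go
package main

import "fmt"

// estimateVRAMGB returns a rough lower bound, in GiB, on the memory
// needed just to hold a model's weights at a given quantization.
// Sketch only: real estimators (like gollama's) also account for
// KV cache, context length, and runtime overhead.
func estimateVRAMGB(paramsBillions, bitsPerWeight float64) float64 {
	bytes := paramsBillions * 1e9 * bitsPerWeight / 8
	return bytes / (1 << 30)
}

func main() {
	// Approximate effective bits per weight, including block scales.
	quants := []struct {
		name string
		bits float64
	}{
		{"Q4_0", 4.5},
		{"Q8_0", 8.5},
		{"F16", 16},
	}
	for _, q := range quants {
		fmt.Printf("7B @ %-5s >= %.1f GiB\n", q.name, estimateVRAMGB(7, q.bits))
	}
}
```

Running the table across quantization levels like this is what makes such a view useful: the same 7B model spans roughly a 3.5x range in weight memory between 4-bit and 16-bit formats.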

Scores updated daily from GitHub, PyPI, and npm data.