ollama and ollamaclient

                 ollama                      ollamaclient
Score            71 (Verified)               30 (Emerging)
Maintenance      25/25                       2/25
Adoption         10/25                       7/25
Maturity         16/25                       9/25
Community        20/25                       12/25
Stars            164,987                     34
Forks            14,940                      5
Downloads
Commits (30d)    118                         0
Language         Go                          Go
License          MIT                         Apache-2.0
Flags            no package, no dependents   stale 6m; no package, no dependents

About ollama

ollama/ollama

Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.

Provides a unified local inference runtime built on llama.cpp with REST API and language bindings (Python, JavaScript) for seamless integration into applications and agents. Features a model library with automatic download/management, streaming responses, and launcher integrations for Claude, Codex, and other external tools. Ecosystem spans 100+ community projects including web UIs (Open WebUI, LibreChat), desktop clients (AnythingLLM, Cherry Studio), and IDE extensions (Continue, Cline).

About ollamaclient

xyproto/ollamaclient

A Go package, with example utilities, for using Ollama and other LLM servers

Scores updated daily from GitHub, PyPI, and npm data.