ollama and ollamaclient
About ollama
ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
Ollama provides a local inference runtime built on llama.cpp, exposing a REST API and language bindings (Python, JavaScript) for integration into applications and agents. It features a model library with automatic download and management, streaming responses, and launcher integrations for external tools such as Claude and Codex. The ecosystem spans 100+ community projects, including web UIs (Open WebUI, LibreChat), desktop clients (AnythingLLM, Cherry Studio), and IDE extensions (Continue, Cline).
About ollamaclient
xyproto/ollamaclient
Go package and example utilities for using Ollama / LLMs
Scores updated daily from GitHub, PyPI, and npm data.