# jonigl/mcp-client-for-ollama

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop approval, thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.
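Multi-server setups for MCP clients like this are commonly declared in a Claude-style JSON config, which the project reports it can auto-discover. The snippet below is an illustrative fragment in that well-known format; the server name, command, and path are hypothetical examples, not taken from this project.

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]
    }
  }
}
```

Each entry maps a server name to the command the client spawns over the stdio transport; SSE/HTTP servers are instead addressed by URL.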

Score: 69 / 100 (Established)

# Technical Summary

Implements stdio, SSE, and HTTP transport protocols for MCP server communication with automatic reconnection and hot-reload capabilities during development. Built as a Python TUI using modern libraries (Typer, Rich, Textual) that connects Ollama models, both local and cloud-hosted, to MCP tool ecosystems for agentic workflows with iterative tool execution loops. Supports cross-language servers (Python/JavaScript), integrates Claude's native MCP configurations via auto-discovery, and provides safety mechanisms like human-in-the-loop approval gates before tool execution.
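The iterative tool-execution loop with a human-in-the-loop gate can be sketched in plain Python. This is a minimal illustration of the pattern, not the project's actual API: `model_step`, the tool registry, and the `approve` callback are all stand-ins.

```python
# Sketch of an agentic loop: ask the model for a step, gate tool
# calls behind human approval, feed results back, repeat until the
# model produces a final answer. All names here are illustrative.

def run_agent(model_step, tools, approve, max_iters=5):
    """Drive the model/tool loop until an answer or the iteration cap."""
    history = []
    for _ in range(max_iters):
        step = model_step(history)
        if step["type"] == "answer":
            return step["text"]
        # The step is a tool call: require human approval first.
        name, args = step["tool"], step["args"]
        if not approve(name, args):
            history.append({"tool": name, "result": "denied by user"})
            continue
        result = tools[name](**args)
        history.append({"tool": name, "result": result})
    return "max iterations reached"


# Stub model: first requests a tool, then answers from its result.
def stub_model(history):
    if not history:
        return {"type": "tool_call", "tool": "add", "args": {"a": 2, "b": 3}}
    return {"type": "answer", "text": f"sum is {history[-1]['result']}"}


tools = {"add": lambda a, b: a + b}
print(run_agent(stub_model, tools, approve=lambda name, args: True))
# -> sum is 5
```

Swapping `approve` for an interactive prompt turns the gate into the kind of confirmation step the summary describes; denying a call records the refusal in the history instead of executing the tool.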

563 stars and 3,964 monthly downloads. Actively maintained with 1 commit in the last 30 days. Available on PyPI.

| Category | Score |
| --- | --- |
| Maintenance | 13 / 25 |
| Adoption | 18 / 25 |
| Maturity | 18 / 25 |
| Community | 20 / 25 |


| Stat | Value |
| --- | --- |
| Stars | 563 |
| Forks | 82 |
| Language | Python |
| License | MIT |
| Last pushed | Feb 19, 2026 |
| Monthly downloads | 3,964 |
| Commits (30d) | 1 |
| Dependencies | 5 |

Get this data via the API:

```shell
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/jonigl/mcp-client-for-ollama"
```

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.