mcp-client-for-ollama vs. zin-mcp-client

These projects are direct competitors: both provide client interfaces (TUI or CLI) that connect local Ollama LLMs to MCP servers. The first is the more mature and feature-rich option (agent mode, multi-server support, tool management), while the second is a simpler alternative for the same use case.

| | mcp-client-for-ollama | zin-mcp-client |
|---|---|---|
| Overall score | 69 (Established) | 37 (Emerging) |
| Maintenance | 13/25 | 0/25 |
| Adoption | 18/25 | 9/25 |
| Maturity | 18/25 | 9/25 |
| Community | 20/25 | 19/25 |
| Stars | 563 | 99 |
| Forks | 82 | 20 |
| Downloads | 3,964 | — |
| Commits (30d) | 1 | 0 |
| Language | Python | Python |
| License | MIT | Apache-2.0 |
| Risk flags | None | Archived, Stale 6m, No Package, No Dependents |

About mcp-client-for-ollama

jonigl/mcp-client-for-ollama

A text-based user interface (TUI) client for interacting with MCP servers using Ollama. Features include agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop approval, a thinking mode, model parameter configuration, MCP prompts, custom system prompts, and saved preferences. Built for developers working with local LLMs.

Technical Summary

Implements stdio, SSE, and HTTP transport protocols for MCP server communication with automatic reconnection and hot-reload capabilities during development. Built as a Python TUI using modern libraries (Typer, Rich, Textual) that connects Ollama models—both local and cloud-hosted—to MCP tool ecosystems for agentic workflows with iterative tool execution loops. Supports cross-language servers (Python/JavaScript), integrates Claude's native MCP configurations via auto-discovery, and provides safety mechanisms like human-in-the-loop approval gates before tool execution.
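To make the "iterative tool execution loop" and "human-in-the-loop approval gate" concrete, here is a minimal sketch of how such a loop can be structured. All names (`ToolCall`, `approve_tool_call`, `run_agent_loop`) are illustrative, not taken from mcp-client-for-ollama's actual codebase, and the model/server interaction is stubbed out as a plain callable.

```python
# Illustrative sketch: an agent loop that executes model-proposed tool
# calls, gated by human approval. Names are hypothetical, not the
# project's real API.
from dataclasses import dataclass
from typing import Callable


@dataclass
class ToolCall:
    name: str
    arguments: dict


def approve_tool_call(call: ToolCall, ask: Callable[[str], str] = input) -> bool:
    """Human-in-the-loop gate: ask before executing; True only on 'y'."""
    answer = ask(f"Run tool '{call.name}' with {call.arguments}? [y/N] ")
    return answer.strip().lower() == "y"


def run_agent_loop(model_step, tools: dict, ask=input, max_iters: int = 5):
    """Iterative loop: the model proposes tool calls, approved calls are
    executed, and results feed the next step; stops when no calls remain."""
    results = []
    for _ in range(max_iters):
        calls = model_step(results)           # model proposes tool calls
        if not calls:                         # nothing left to do -> done
            break
        for call in calls:
            if approve_tool_call(call, ask):  # approval gate before execution
                results.append(tools[call.name](**call.arguments))
            else:
                results.append(None)          # record the refusal
    return results


# Usage with a stubbed model that proposes one call, then finishes:
def model_step(results):
    return [] if results else [ToolCall("add", {"a": 1, "b": 2})]

run_agent_loop(model_step, {"add": lambda a, b: a + b}, ask=lambda _: "y")  # -> [3]
```

In the real client the `model_step` role is played by an Ollama chat turn and `tools` by MCP server tool invocations, but the control flow sketched above is the same shape.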

About zin-mcp-client

zinja-coder/zin-mcp-client

An MCP client that serves as a bridge between MCP servers and local LLMs running on Ollama. Created primarily for MCP servers developed by the author, though other MCP servers may work as well.

Technical Summary

Implements a ReAct agent framework using LangChain to intelligently route tool invocations across multiple MCP servers via stdio transport, with support for local LLM reasoning through Ollama. Offers multiple interfaces—CLI, lightweight web UI, and Open Web UI plugin—enabling flexible deployment from personal development to integrated environments. Designed as a minimal, performant bridge prioritizing stdio-based MCP server compatibility over feature bloat.
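The routing idea—collect the tools advertised by every connected server into one registry, then dispatch each invocation to the server that owns the tool—can be sketched without the LangChain dependency. This is an illustrative stand-in, not zin-mcp-client's actual code; `StdioServer` here is a plain in-process stub for a server reached over stdio transport.

```python
# Minimal sketch of routing tool invocations across multiple MCP
# servers. Class names are hypothetical stand-ins for illustration.
class StdioServer:
    """Stub for an MCP server reached over stdio transport."""

    def __init__(self, name: str, tools: dict):
        self.name = name
        self._tools = tools  # tool name -> callable

    def list_tools(self):
        return list(self._tools)

    def call_tool(self, tool: str, **kwargs):
        return self._tools[tool](**kwargs)


class ToolRouter:
    """Builds one registry from all connected servers, dispatches by name."""

    def __init__(self, servers):
        self._registry = {}
        for server in servers:
            for tool in server.list_tools():
                self._registry[tool] = server  # last server wins on name clash

    def invoke(self, tool: str, **kwargs):
        return self._registry[tool].call_tool(tool, **kwargs)


# Usage: two servers, each owning different tools.
router = ToolRouter([
    StdioServer("math", {"add": lambda a, b: a + b}),
    StdioServer("text", {"upper": lambda s: s.upper()}),
])
router.invoke("upper", s="mcp")  # -> "MCP"
```

In the real client the LangChain ReAct agent decides *which* tool to call from the model's reasoning; the registry above only solves the second half of the problem, mapping a chosen tool name to the right server.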

Scores are updated daily from GitHub, PyPI, and npm data.