mcp-client-for-ollama and llm_mcp

| | mcp-client-for-ollama | llm_mcp |
| --- | --- | --- |
| Score | 75 (Verified) | 32 (Emerging) |
| Maintenance | 13/25 | 10/25 |
| Adoption | 18/25 | 1/25 |
| Maturity | 24/25 | 9/25 |
| Community | 20/25 | 12/25 |
| Stars | 563 | 1 |
| Forks | 82 | 1 |
| Downloads | 3,964 | — |
| Commits (30d) | 1 | 0 |
| Language | Python | Python |
| License | MIT | Apache-2.0 |
| Risk flags | None | No package, no dependents |

About mcp-client-for-ollama

jonigl/mcp-client-for-ollama

A text-based user interface (TUI) client for interacting with MCP servers through Ollama. Features include an agent mode, multi-server support, model switching, streaming responses, tool management, human-in-the-loop approval, thinking mode, model parameter configuration, MCP prompts, a custom system prompt, and saved preferences. Built for developers working with local LLMs.

Technical Summary

Implements stdio, SSE, and HTTP transport protocols for MCP server communication, with automatic reconnection and hot reload during development. Built as a Python TUI using modern libraries (Typer, Rich, Textual), it connects Ollama models (both local and cloud-hosted) to MCP tool ecosystems for agentic workflows with iterative tool-execution loops. It supports cross-language servers (Python/JavaScript), integrates Claude's native MCP configurations via auto-discovery, and provides safety mechanisms such as human-in-the-loop approval gates before tool execution.
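The kind of loop described above (list tools from an MCP server, hand them to an Ollama model, and gate each tool call behind user approval) can be sketched roughly as follows. This is a minimal illustration assuming the official `mcp` and `ollama` Python packages; the server launch command, model name, and prompt are placeholders, and the actual client's internals may well differ.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from ollama import AsyncClient


async def agent_loop(prompt: str) -> str:
    # Placeholder launch command for whatever MCP server you want to expose.
    server = StdioServerParameters(command="python", args=["my_mcp_server.py"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Translate MCP tool definitions into the schema Ollama expects.
            mcp_tools = (await session.list_tools()).tools
            tools = [{
                "type": "function",
                "function": {
                    "name": t.name,
                    "description": t.description or "",
                    "parameters": t.inputSchema,
                },
            } for t in mcp_tools]

            client = AsyncClient()
            messages = [{"role": "user", "content": prompt}]
            while True:
                response = await client.chat(model="qwen2.5", messages=messages, tools=tools)
                messages.append(response.message)
                if not response.message.tool_calls:
                    # No further tool calls: the model produced a final answer.
                    return response.message.content or ""

                for call in response.message.tool_calls:
                    # Human-in-the-loop gate before any tool execution.
                    if input(f"Run tool {call.function.name}? [y/N] ").lower() != "y":
                        messages.append({"role": "tool", "content": "Tool call rejected by user."})
                        continue
                    result = await session.call_tool(
                        call.function.name, arguments=dict(call.function.arguments)
                    )
                    # Crude serialization of the MCP result for the model to read.
                    messages.append({"role": "tool", "content": str(result.content)})


if __name__ == "__main__":
    print(asyncio.run(agent_loop("List the files in the project root.")))
```

The approval prompt before `session.call_tool` mirrors the human-in-the-loop gate mentioned in the summary; dropping it turns the sketch into a fully autonomous agent loop.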

About llm_mcp

AuraFriday/llm_mcp

An MCP server that runs local LLMs, with full access to the included MCP tools. It can be called from Python to chain MCP tools together with local model intelligence.
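As a rough illustration of the "callable by Python" angle, the sketch below drives an MCP server over stdio using the official `mcp` SDK. The launch command and the `generate` tool name are assumptions made for illustration; consult the repository for the actual entry point and tool names.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Hypothetical launch command; the real entry point may differ.
    server = StdioServerParameters(command="python", args=["-m", "llm_mcp"])
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

            # Hypothetical tool name: ask the locally hosted model for a completion.
            result = await session.call_tool(
                "generate", arguments={"prompt": "Summarize this repository."}
            )
            print(result.content)


asyncio.run(main())
```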

Scores are updated daily from GitHub, PyPI, and npm data.