Nagharjun17/MCP-Ollama-Client

Lightweight MCP client that uses a local Ollama LLM to query multiple MCP servers defined in config.json.

Quality score: 35 / 100 (Emerging)

This is a command-line chat interface that allows you to interact with various data sources and services using natural language, all running on your own computer. You type in a question or command in plain English, and it uses a local AI model to figure out which tool or service to use, then fetches the answer for you. It's designed for technical professionals who need to query multiple local systems like databases or filesystems without writing code or using cloud AI.
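The servers the client can reach are declared in config.json. A minimal sketch of what such a file might look like, assuming the common MCP `mcpServers` layout used by many MCP clients (the server names, commands, and paths below are illustrative assumptions, not taken from this repository):

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/home/user/docs"]
    },
    "postgres": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-postgres", "postgresql://localhost/mydb"]
    }
  }
}
```

Each entry names a server and the command used to launch it; the client spawns these processes and routes natural-language queries to whichever server's tools fit. The repository's actual schema may differ.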

No commits in the last 6 months.

Use this if you need to query or interact with multiple local systems (like a PostgreSQL database or your local filesystem) using natural language, without sending your data to the cloud or writing custom scripts.

Not ideal if you need a graphical user interface, primarily interact with cloud-based services, or require complex data transformations before querying.

Tags: data-querying, local-automation, command-line-tools, developer-productivity, information-retrieval
Flags: Stale (6 months), No Package, No Dependents

Maintenance: 2 / 25
Adoption: 4 / 25
Maturity: 15 / 25
Community: 14 / 25


Stars: 7
Forks: 3
Language: Python
License: MIT
Last pushed: Jul 29, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/Nagharjun17/MCP-Ollama-Client"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
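The same endpoint can be called from Python. A minimal sketch using only the standard library; the URL pattern comes from the curl example above, but the shape of the JSON response is an assumption and is not documented here:

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-report URL for a GitHub owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality report (field names are undocumented)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Prints the request URL; call fetch_quality(...) to hit the live API.
    print(quality_url("Nagharjun17", "MCP-Ollama-Client"))
```

Within the free tier, this is rate-limited to 100 requests/day without a key.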