rb58853/fastchat-mcp
fastchat-mcp is a simple way to interact with MCP servers through custom natural-language chats.
It implements a modular Python client built on the MCP SDK that routes natural-language queries through OpenAI models (GPT-5 Nano by default) to execute MCP server tools via stdio or HTTPStream transports. The architecture supports flexible expansion of LLM providers and protocols through configuration files, enabling tool invocation and result streaming without direct knowledge of the MCP protocol. It integrates with OpenAI's API and requires environment-based authentication to connect to MCP servers defined in a JSON config.
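The JSON config mentioned above is not reproduced on this page, so its exact schema is an assumption. A minimal sketch, assuming the widely used `mcpServers` convention (each named entry is either a stdio command to spawn or an HTTP endpoint URL), might look like:

```python
import json

# Hypothetical config sketch: fastchat-mcp's actual schema is not shown here.
# This follows the common "mcpServers" convention, where each named entry
# describes either a local stdio server or a remote HTTP-stream server.
config = {
    "mcpServers": {
        # stdio transport: spawn a local server process
        "filesystem": {
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"],
        },
        # HTTP-stream transport: connect to a remote endpoint
        "remote-tools": {
            "url": "https://example.com/mcp",
        },
    }
}

with open("mcp_servers.json", "w") as f:
    json.dump(config, f, indent=2)
```

The server names (`filesystem`, `remote-tools`), commands, and URL here are placeholders, not values taken from the project.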
Available on PyPI.
Stars: 3
Forks: 2
Language: Python
License: MIT
Category:
Last pushed: Feb 14, 2026
Monthly downloads: 62
Commits (30d): 0
Dependencies: 9
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/rb58853/fastchat-mcp"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
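The same endpoint shown in the curl command can be queried from Python. A minimal sketch; the URL is taken from above, but the response schema is not documented here, so the result is treated as untyped JSON:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"


def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository quality endpoint URL."""
    return f"{BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    # No API key needed up to 100 requests/day; the response shape is
    # whatever JSON the service returns.
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    print(fetch_quality("rb58853", "fastchat-mcp"))
```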
Related servers
keli-wen/mcp_chatbot
A chatbot implementation compatible with MCP (terminal / streamlit supported)
SecretiveShell/MCP-wolfram-alpha
Connect your chat repl to wolfram alpha computational intelligence
Elkhn/mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol...
alphasecio/mcp-client-server
A collection of MCP servers and a Streamlit-based MCP chatbot.
Nikunj2003/LLaMa-MCP-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Control Protocol (MCP).