Nikunj2003/LLaMa-MCP-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and the Model Context Protocol (MCP).
Enables real-time tool execution through MCP's stdio transport, allowing the LLM to dynamically invoke external services (filesystem access, APIs) during conversations. Supports flexible backend configuration via NPX or Docker-based MCP servers, with switchable LLM endpoints (NVIDIA NIM or local Ollama). Built with Streamlit's reactive chat UI and Poetry for reproducible dependency management, including Docker deployment options.
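MCP's stdio transport exchanges newline-delimited JSON-RPC 2.0 messages with a server subprocess, which is how the LLM's tool calls reach external services. A minimal sketch of framing one such `tools/call` request (the tool name and argument shape here are hypothetical, not taken from this repo):

```python
import json

def frame_request(req_id: int, method: str, params: dict) -> bytes:
    """Serialize a JSON-RPC 2.0 request as a single newline-delimited
    line, the message framing MCP's stdio transport uses."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method, "params": params}
    return (json.dumps(msg) + "\n").encode("utf-8")

# Hypothetical tool invocation: ask an MCP filesystem server to read a file.
line = frame_request(1, "tools/call", {
    "name": "read_file",                      # assumed tool name
    "arguments": {"path": "/tmp/notes.txt"},  # assumed argument shape
})

decoded = json.loads(line.decode("utf-8"))
print(decoded["method"])  # tools/call
```

In practice a client writes these lines to the server process's stdin and reads newline-delimited responses from its stdout; SDKs such as the official MCP Python client wrap this framing for you.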
No commits in the last 6 months.
Stars: 43
Forks: 18
Language: Python
License: —
Category: —
Last pushed: Feb 09, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/Nikunj2003/LLaMa-MCP-Streamlit"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000/day.
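The curl call above can also be made from Python. The sketch below builds the endpoint URL from a repo slug; the actual fetch is left commented out, and since the response schema isn't documented here, no JSON fields are assumed:

```python
from urllib.request import urlopen  # used by the commented-out fetch below

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"

def quality_url(repo: str) -> str:
    """Build the quality-API URL for an owner/name repo slug."""
    return f"{API_BASE}/{repo}"

url = quality_url("Nikunj2003/LLaMa-MCP-Streamlit")
print(url)

# Live fetch (rate-limited to 100 requests/day without a key):
# with urlopen(url) as resp:
#     payload = resp.read()
```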
Higher-rated alternatives
keli-wen/mcp_chatbot
A chatbot implementation compatible with MCP (terminal / streamlit supported)
rb58853/fastchat-mcp
fastchat-mcp is a very simple way to interact with MCP servers using custom chats through...
SecretiveShell/MCP-wolfram-alpha
Connect your chat repl to wolfram alpha computational intelligence
Elkhn/mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol...
alphasecio/mcp-client-server
A collection of MCP servers and a Streamlit-based MCP chatbot.