Nikunj2003/LLaMa-MCP-Streamlit

AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and the Model Context Protocol (MCP).

Quality score: 35 / 100 (Emerging)

Enables real-time tool execution through MCP's stdio transport, allowing the LLM to dynamically invoke external services (filesystem access, APIs) during conversations. Supports flexible backend configuration via NPX or Docker-based MCP servers, with switchable LLM endpoints (NVIDIA NIM or local Ollama). Built with Streamlit's reactive chat UI and Poetry for reproducible dependency management, including Docker deployment options.
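The NPX- and Docker-based MCP servers mentioned above are typically declared in a JSON configuration block. A minimal sketch of that shape is below; the server names, package, image, and path are illustrative assumptions, not taken from this repository:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/data"]
    },
    "fetch": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "mcp/fetch"]
    }
  }
}
```

Each entry gives the command the MCP client spawns; the server then speaks MCP over the spawned process's stdio, which is the transport the description refers to.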

No commits in the last 6 months.

No License · Stale 6m · No Package · No Dependents
Maintenance 0 / 25
Adoption 8 / 25
Maturity 8 / 25
Community 19 / 25


Stars: 43
Forks: 18
Language: Python
License: none
Last pushed: Feb 09, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/Nikunj2003/LLaMa-MCP-Streamlit"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
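The same endpoint can be called from Python instead of curl. A minimal sketch using only the standard library — note that the response field names are not documented here, so inspect the raw JSON for the actual schema:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/mcp"


def build_quality_url(owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for a given GitHub repo."""
    return f"{API_BASE}/{quote(owner)}/{quote(repo)}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality report as parsed JSON.

    The schema of the returned dict is an assumption; the API's
    documentation (if any) is authoritative.
    """
    with urlopen(build_quality_url(owner, repo)) as resp:
        return json.load(resp)


# Build the URL for the repository shown on this page.
print(build_quality_url("Nikunj2003", "LLaMa-MCP-Streamlit"))
```

Keeping URL construction separate from the network call makes the client easy to test without hitting the daily request quota.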