FlowLLM-AI/flowllm
FlowLLM: Simplifying LLM-based HTTP/MCP Service Development
Exposes LLM, embedding, and vector store capabilities as HTTP and MCP (Model Context Protocol) services through an Op-based architecture. Ops inherit from `BaseOp`/`BaseAsyncOp`, access models via lazy-initialized properties, and compose into Flows via YAML using serial (`>>`) and parallel (`|`) operators. FlowLLM automatically generates RESTful APIs and MCP tools with async streaming support, and ships built-in token counting with multiple backends (OpenAI, Hugging Face).
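The description above covers the Op/Flow model only at a high level. The sketch below illustrates the pattern under stated assumptions: `BaseOp` and the serial (`>>`) / parallel (`|`) composition come from the description, while the `execute` method name, the hand-rolled serial loop, and the YAML keys are illustrative guesses rather than FlowLLM's actual API.

```python
# Illustrative sketch only. BaseOp and the >>/| composition style are taken from
# the project description; method names, the YAML keys, and the manual serial
# loop below are assumptions, not FlowLLM's real interface.


class BaseOp:
    """Stand-in for FlowLLM's Op base class."""

    def execute(self, context: dict) -> dict:  # assumed method name
        raise NotImplementedError


class EmbedQueryOp(BaseOp):
    def execute(self, context: dict) -> dict:
        # In FlowLLM the embedding model would be a lazy-initialized property;
        # here a fake vector keeps the example runnable.
        context["query_vector"] = [0.0] * 8
        return context


class SearchVectorStoreOp(BaseOp):
    def execute(self, context: dict) -> dict:
        context["hits"] = ["doc-1", "doc-2"]  # placeholder retrieval result
        return context


if __name__ == "__main__":
    # Serial composition done by hand; in FlowLLM this would be declared in YAML,
    # e.g. (keys assumed):  flow: "embed_query >> (search_vector_store | rerank)"
    ctx: dict = {"query": "what does flowllm do?"}
    for op in (EmbedQueryOp(), SearchVectorStoreOp()):
        ctx = op.execute(ctx)
    print(ctx["hits"])
```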
32 stars and 175,152 monthly downloads. Used by 2 other packages. Available on PyPI.
Stars: 32
Forks: 2
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 18, 2026
Monthly downloads: 175,152
Commits (30d): 0
Dependencies: 19
Reverse dependents: 2
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mcp/FlowLLM-AI/flowllm"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
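The same endpoint can be called programmatically; the sketch below uses only the Python standard library. Only the URL comes from this page; the assumption that the body is JSON is illustrative.

```python
# Minimal sketch of fetching the quality data shown above.
# Only the URL is taken from this page; the JSON response shape is an assumption.
import json
import urllib.request

URL = "https://pt-edge.onrender.com/api/v1/quality/mcp/FlowLLM-AI/flowllm"

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)  # assumes the endpoint returns a JSON body

print(json.dumps(data, indent=2))  # inspect whatever fields the API returns
```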
Related servers
jonigl/ollama-mcp-bridge
Extend the Ollama API with dynamic AI tool integration from multiple MCP (Model Context...
CodeLogicIncEngineering/codelogic-mcp-server
An MCP Server to utilize Codelogic's rich software dependency data in your AI programming assistant.
sib-swiss/sparql-llm
🦜✨ Chat system, MCP server, and reusable components to improve LLMs' capabilities when generating...
thedaviddias/mcp-llms-txt-explorer
MCP to explore websites with llms.txt files
webworn/openfoam-mcp-server
LLM-powered OpenFOAM MCP server for intelligent CFD education with Socratic questioning and...