FlowLLM-AI/flowllm

FlowLLM: Simplifying LLM-based HTTP/MCP Service Development

Quality score: 59 / 100 (Established)

Exposes LLM, embedding, and vector store capabilities as HTTP and MCP (Model Context Protocol) services through an Op-based architecture. Ops inherit from `BaseOp`/`BaseAsyncOp`, access models via lazily initialized properties, and compose into Flows via YAML using serial (`>>`) and parallel (`|`) operators. FlowLLM automatically generates RESTful APIs and MCP tools with async streaming support, and includes built-in token counting with multiple backends (OpenAI, Hugging Face).
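As a rough sketch of the pattern described above (not the library's confirmed API: the import path, the `execute` method name, the `self.llm` call, and the YAML shape are all assumptions):

from flowllm import BaseOp  # import path assumed

class SummarizeOp(BaseOp):
    # Hypothetical Op: self.llm stands in for the lazily initialized
    # model property described above; the chat() call is illustrative only.
    def execute(self, text: str) -> str:
        return self.llm.chat(f"Summarize in one sentence: {text}")

# Flows are declared in YAML using the serial (>>) and parallel (|)
# operators; an expression like the following is the general shape
# (op names and YAML key hypothetical):
#
#   summary_flow: fetch_op >> (summarize_op | embed_op)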

32 stars and 175,152 monthly downloads. Used by 2 other packages. Available on PyPI.

Maintenance: 10 / 25
Adoption: 19 / 25
Maturity: 24 / 25
Community: 6 / 25

(These sum to the 59 / 100 overall score.)


Stars: 32
Forks: 2
Language: Python
License: Apache-2.0
Last pushed: Feb 18, 2026
Monthly downloads: 175,152
Commits (30d): 0
Dependencies: 19
Reverse dependents: 2

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/mcp/FlowLLM-AI/flowllm"

Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
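The same data can be fetched programmatically; a minimal Python sketch, assuming the endpoint returns JSON (the response fields are whatever this page displays, not a documented schema):

import requests

URL = "https://pt-edge.onrender.com/api/v1/quality/mcp/FlowLLM-AI/flowllm"

# Anonymous access allows 100 requests/day; a free key raises this to
# 1,000/day (how the key is passed is not shown on this page).
resp = requests.get(URL, timeout=10)
resp.raise_for_status()
print(resp.json())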