parthalon025/ollama-queue
Priority job queue and scheduler for local Ollama LLM inference: smart scheduling, DLQ, A/B eval pipeline, web dashboard.
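The tagline names a priority job queue for inference requests. As a minimal sketch of that core idea using Python's stdlib `heapq` (the class name, `submit`/`next_job` methods, and fields are illustrative assumptions, not ollama-queue's actual API):

```python
import heapq
import itertools

class PriorityJobQueue:
    """Illustrative priority job queue (not ollama-queue's real API).

    Lower priority numbers run first; a monotonic counter breaks ties
    so equal-priority jobs are served FIFO.
    """

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()

    def submit(self, prompt, priority=10):
        # Store (priority, seq, payload); heapq orders tuples element-wise.
        heapq.heappush(self._heap, (priority, next(self._counter), prompt))

    def next_job(self):
        priority, _, prompt = heapq.heappop(self._heap)
        return priority, prompt

q = PriorityJobQueue()
q.submit("summarize server log", priority=5)
q.submit("nightly batch eval", priority=20)
q.submit("interactive chat reply", priority=1)
print(q.next_job())  # the priority-1 job is dequeued first
```

A scheduler like this would typically wrap the queue in a worker loop that feeds jobs to the local Ollama instance one at a time.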
Overall score: 22 / 100 (Experimental)
No Package
No Dependents
Maintenance: 13 / 25
Adoption: 0 / 25
Maturity: 9 / 25
Community: 0 / 25
Stars: —
Forks: —
Language: Python
License: MIT
Category: —
Last pushed: Mar 19, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/parthalon025/ollama-queue"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
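The curl call above can be wrapped in a small Python helper. The endpoint path is taken directly from the example; the assumption that the endpoint returns JSON is mine, and since this page does not document how an API key is passed, the sketch only builds the keyless URL:

```python
import json
import urllib.request

# Base path taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    # Build the endpoint URL in the form shown by the curl example.
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo, timeout=10):
    # Assumes a JSON response; keyless calls are limited to 100 requests/day.
    with urllib.request.urlopen(quality_url(category, owner, repo),
                                timeout=timeout) as resp:
        return json.load(resp)

print(quality_url("llm-tools", "parthalon025", "ollama-queue"))
# → https://pt-edge.onrender.com/api/v1/quality/llm-tools/parthalon025/ollama-queue
```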
Higher-rated alternatives
ollama/ollama
Get up and running with Kimi-K2.5, GLM-5, MiniMax, DeepSeek, gpt-oss, Qwen, Gemma and other models.
71
jd-opensource/JDOxyGent4J
JDOxyGent4J: The Java sibling of the OxyGent ecosystem.
49
dext7r/ollama-api-pool
Intelligent Ollama API proxy pool based on Cloudflare Workers
43
sammcj/gollama
Go manage your Ollama models
42
nandlabs/golly
golly is an open source library for Go
40