worker-vllm and runpod-worker-oobabooga
These are ecosystem siblings: both are RunPod serverless worker templates that expose an LLM inference backend (vLLM vs. Oobabooga), letting users pick the serving framework that best fits their model and performance requirements.
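Both workers follow the same serverless pattern: a handler function receives a job payload from RunPod and returns the generated text. The sketch below illustrates that pattern under assumptions; `generate()` is a hypothetical stand-in for the real vLLM or Oobabooga backend call, and the commented `runpod.serverless.start` line shows how such a handler is typically registered with the RunPod SDK.

```python
# Minimal sketch of the serverless handler pattern both workers implement.
# generate() is a hypothetical placeholder, NOT the actual backend call
# used by worker-vllm or runpod-worker-oobabooga.

def generate(prompt: str, max_tokens: int = 16) -> str:
    # Stand-in for a call into the inference backend
    # (vLLM engine or the Oobabooga text-generation API).
    return f"echo: {prompt[:max_tokens]}"

def handler(job: dict) -> dict:
    # RunPod delivers each request as a job dict with an "input" payload.
    job_input = job.get("input", {})
    prompt = job_input.get("prompt", "")
    return {"text": generate(prompt)}

# In a deployed worker, the handler is registered with the RunPod SDK,
# roughly like this (requires the runpod package):
#   import runpod
#   runpod.serverless.start({"handler": handler})

print(handler({"input": {"prompt": "hello"}}))
```

The difference between the two projects lies almost entirely inside `generate()`: worker-vllm drives a vLLM engine directly, while runpod-worker-oobabooga proxies requests to an Oobabooga text-generation API instance.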
Scores

worker-vllm: Maintenance 13/25, Adoption 10/25, Maturity 16/25, Community 25/25
runpod-worker-oobabooga: Maintenance 13/25, Adoption 3/25, Maturity 9/25, Community 14/25
Stats

worker-vllm: Stars 406, Forks 290, Downloads —, Commits (30d) 0, Language Python, License MIT
runpod-worker-oobabooga: Stars 3, Forks 3, Downloads —, Commits (30d) 0, Language Python, License GPL-3.0
Neither repository is published as a package, and neither has registered dependents.
About worker-vllm
runpod-workers/worker-vllm
The RunPod worker template for serving our large language model endpoints. Powered by vLLM.
About runpod-worker-oobabooga
ashleykleynhans/runpod-worker-oobabooga
RunPod Serverless Worker for Oobabooga Text Generation API for LLMs
Scores updated daily from GitHub, PyPI, and npm data.