jina-ai/serve
☁️ Build multimodal AI applications with a cloud-native stack
Provides gRPC/HTTP/WebSocket service orchestration with native ML framework support and DocArray-based typed data handling. Scales from local development through Kubernetes with built-in dynamic batching, streaming LLM output, and containerization via Executor Hub. One-command cloud deployment to Jina AI Cloud, plus Docker Compose and Kubernetes export for enterprise environments.
21,848 stars. No commits in the last 6 months.
Stars: 21,848
Forks: 2,244
Language: Python
License: Apache-2.0
Category: MLOps
Last pushed: Mar 24, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/jina-ai/serve"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
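The endpoint above appears to follow the path pattern /api/v1/quality/{category}/{owner}/{repo}. A minimal Python helper for building such URLs, as a sketch: the pattern is inferred from this single documented example, so treat it as an assumption rather than a guaranteed API contract.

```python
# Base URL taken from the documented curl example. The path pattern
# /api/v1/quality/{category}/{owner}/{repo} is inferred from that one
# example and may not generalize -- treat it as an assumption.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-data API URL for a given repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

# Reproduces the documented endpoint for jina-ai/serve:
print(quality_url("mlops", "jina-ai", "serve"))
# → https://pt-edge.onrender.com/api/v1/quality/mlops/jina-ai/serve
```

The resulting URL can then be fetched with curl, as shown above, or any HTTP client.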
Higher-rated alternatives
kserve/kserve
Standardized Distributed Generative and Predictive AI Inference Platform for Scalable,...
omegaml/omegaml
MLOps simplified. One-stop AI delivery platform, all the features you need.
awslabs/aiops-modules
AIOps modules is a collection of reusable Infrastructure as Code (IaC) modules for Machine...
george0st/qgate-sln-mlrun
MLRun/Iguazio/Nuclio quality gate solution. The solution checks the quality of MLRun...
demml/opsml
Quality Control for AI Artifact Management