SemiAnalysisAI/InferenceX-app
Dashboard for InferenceX™, Open Source Continuous Inference
Overall score: 26 / 100 (Experimental)
No Package · No Dependents

Maintenance: 13 / 25
Adoption: 2 / 25
Maturity: 11 / 25
Community: 0 / 25

Stars: 2
Forks: —
Language: TypeScript
License: GPL-3.0
Category: mlops
Last pushed: Mar 19, 2026
Commits (30d): 0
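The overall score appears to be the sum of the four 25-point category scores above (13 + 2 + 11 + 0 = 26). A minimal sketch of that assumed scoring rule; the weighting is an inference from the numbers on this page, not a documented formula:

```typescript
// Assumption: overall score = sum of four category scores, each out of 25.
const categories = { maintenance: 13, adoption: 2, maturity: 11, community: 0 };

const overall = Object.values(categories).reduce((sum, score) => sum + score, 0);

console.log(`${overall} / 100`); // 26 / 100
```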
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/SemiAnalysisAI/InferenceX-app"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
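The same endpoint can be called from code. A sketch in TypeScript, assuming only the URL pattern visible in the curl command (`/api/v1/quality/<category>/<owner>/<repo>`); the JSON response shape is not documented here, so the result is left untyped:

```typescript
const BASE = "https://pt-edge.onrender.com/api/v1/quality";

// Build the endpoint URL following the pattern used in the curl example.
function qualityUrl(category: string, owner: string, repo: string): string {
  return `${BASE}/${category}/${owner}/${repo}`;
}

// Fetch the quality data. Keyless access is rate-limited to 100 requests/day.
async function fetchQuality(
  category: string,
  owner: string,
  repo: string,
): Promise<unknown> {
  const res = await fetch(qualityUrl(category, owner, repo));
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```

For this repo the call would be `fetchQuality("mlops", "SemiAnalysisAI", "InferenceX-app")`.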
Higher-rated alternatives
mosecorg/mosec (score: 70)
  A high-performance ML model serving framework that offers dynamic batching and CPU/GPU pipelines to...

jeremyarancio/VLM-Batch-Deployment (score: 36)
  Batch Deployment for Document Parsing with AWS Batch & Qwen-2.5-VL

amanparuthi8/gpu-llm-india-2026 (score: 14)
  Should you buy a DGX Spark or rent H100s? Run on a Mac Mini or a TAALAS cluster? Full cost &...

DunaSpice/JetsonMind (score: 13)
  Production-ready AI inference system for NVIDIA Jetson devices with MCP integration, Docker...