DunaSpice/JetsonMind
Production-ready AI inference system for NVIDIA Jetson devices with MCP integration, Docker containerization, and edge optimization. Sub-second startup, 99.9%+ reliability.
No commits in the last 6 months.
Stars: —
Forks: —
Language: TypeScript
License: MIT
Category: —
Last pushed: Sep 21, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/mlops/DunaSpice/JetsonMind"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
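As a minimal sketch, the same request can be scripted. The endpoint path pattern (`/api/v1/quality/<category>/<owner>/<repo>`) is inferred from the example URL above; the shape of the JSON response is not documented here, so it is returned as-is.

```python
# Sketch: build and fetch a pt-edge quality API URL for a repository.
# The path pattern is inferred from the documented example URL; the
# response schema is an assumption and is returned unparsed.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Return the quality endpoint URL for a repository."""
    return f"{BASE}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality record as JSON (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


print(quality_url("mlops", "DunaSpice", "JetsonMind"))
# https://pt-edge.onrender.com/api/v1/quality/mlops/DunaSpice/JetsonMind
```

The URL builder is separated from the network call so the path construction can be checked without making a request.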
Higher-rated alternatives
mosecorg/mosec: A high-performance ML model serving framework that offers dynamic batching and CPU/GPU pipelines to...
jeremyarancio/VLM-Batch-Deployment: Batch deployment for document parsing with AWS Batch and Qwen-2.5-VL.
SemiAnalysisAI/InferenceX-app: Dashboard for InferenceX™, open-source continuous inference.
amanparuthi8/gpu-llm-india-2026: Should you buy a DGX Spark or rent H100s? Run on a Mac Mini or a TAALAS cluster? Full cost &...