surajiitd/NVIDIA_Jetson_Inference
This repo contains model compression (using TensorRT) and documentation for running various deep learning models on NVIDIA Jetson Orin and Nano (aarch64 architecture).
No commits in the last 6 months.
Stars: 9
Forks: 3
Language: Makefile
License: —
Category:
Last pushed: May 26, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/surajiitd/NVIDIA_Jetson_Inference"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
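The same endpoint can be called from Python. This is a minimal sketch that only builds the request URL from the pattern shown in the curl example above; the `quality_url` helper is hypothetical, the `ml-frameworks` path segment is taken from this repo's listing (other repos may sit under a different category), and the actual fetch is left commented out because the response schema is not documented here.

```python
import urllib.parse

# Base path taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Hypothetical helper: build the quality-API URL for a repository."""
    return "/".join([BASE,
                     urllib.parse.quote(category),
                     urllib.parse.quote(owner),
                     urllib.parse.quote(repo)])

url = quality_url("ml-frameworks", "surajiitd", "NVIDIA_Jetson_Inference")
print(url)

# To actually fetch the JSON (requires network access):
# import json, urllib.request
# data = json.load(urllib.request.urlopen(url))
```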
Higher-rated alternatives
roboflow/inference
Turn any computer or edge device into a command center for your computer vision projects.
roboflow/roboflow-python
The official Roboflow Python package. Manage your datasets, models, and deployments. Roboflow...
hailo-ai/tappas
High-performance, optimized pre-trained template AI application pipelines for systems using Hailo devices
dusty-nv/jetson-inference
Hello AI World guide to deploying deep-learning inference networks and deep vision primitives...
open-edge-platform/geti
Build computer vision models in a fraction of the time and with less data.