tract and tensorrt-infer

These tools are competitors: both `tract` and `tensorrt-infer` provide Rust interfaces for performing ONNX inference, but they use different underlying engines. `tract` ships its own self-contained inference engine (which runs ONNX, TensorFlow 1.x, and NNEF models), while `tensorrt-infer` wraps NVIDIA TensorRT.

                 tract            tensorrt-infer
Score            70 (Verified)    27 (Experimental)
Maintenance      25/25            13/25
Adoption         10/25            5/25
Maturity         16/25            9/25
Community        19/25            0/25
Stars            2,818            1
Forks            250              —
Downloads        —                41
Commits (30d)    323              0
Language         Rust             Rust
License          —                MIT
Package          No Package       No Package
Dependents       No Dependents    No Dependents

About tract

sonos/tract

Tiny, no-nonsense, self-contained TensorFlow and ONNX inference

Implements graph-level optimization passes (constant folding, operator fusion, quantization-aware transformations) and supports symbolic dimensions for dynamic shapes, enabling efficient inference on resource-constrained embedded systems. Built in Rust with zero external dependencies, it provides both a standalone CLI and language bindings (Python, C) for framework integration. Handles ONNX (85%+ operator coverage), TensorFlow 1.x, and NNEF formats with a production-focused subset philosophy that excludes rarely-used features like tensor sequences in favor of maintainability and performance.
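The load/optimize/run pipeline described above can be sketched with tract's typical usage. This is a minimal illustration, not a complete program; the model path and the 1x3x224x224 input shape are assumptions chosen for the example:

```rust
use tract_onnx::prelude::*;

fn main() -> TractResult<()> {
    let model = tract_onnx::onnx()
        // load an ONNX model from disk (path is a placeholder)
        .model_for_path("model.onnx")?
        // pin down the input type and shape so optimization passes can run
        .with_input_fact(0, f32::fact([1, 3, 224, 224]).into())?
        // apply graph-level optimizations (constant folding, fusion, ...)
        .into_optimized()?
        // freeze the plan into a runnable model
        .into_runnable()?;

    // run inference on a dummy all-zeros input tensor
    let input: Tensor = tract_ndarray::Array4::<f32>::zeros((1, 3, 224, 224)).into();
    let outputs = model.run(tvec!(input.into()))?;
    println!("{:?}", outputs[0].shape());
    Ok(())
}
```

The `with_input_fact` call is where tract's symbolic-dimension support comes in: a dynamic batch dimension can be expressed symbolically instead of the fixed `1` used here.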

About tensorrt-infer

LdDl/tensorrt-infer

Rust wrapper for NVIDIA TensorRT inference.

Scores updated daily from GitHub, PyPI, and npm data.