awslabs/fortuna
A Library for Uncertainty Quantification.
Archived. Provides framework-agnostic calibration and conformal prediction methods for classification and regression, plus Bayesian inference procedures for Flax models to quantify both aleatoric and epistemic uncertainty. Operates across three usage modes: post-hoc calibration of pre-trained model outputs, conformal prediction from existing uncertainty estimates, and end-to-end Bayesian training with JAX/Flax. Returns rigorous prediction sets with coverage guarantees rather than point estimates, enabling safety-critical deployments.
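To illustrate the kind of coverage guarantee conformal prediction provides, here is a minimal split-conformal regression sketch in plain NumPy. It is not Fortuna's API (all names here are local to this example); it only demonstrates the underlying technique: calibrate a quantile of residuals on held-out data, then widen point predictions into intervals that cover the truth with probability at least 1 - alpha.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = 2x + Gaussian noise
x = rng.uniform(-1, 1, size=2000)
y = 2.0 * x + rng.normal(scale=0.3, size=x.size)

# Three disjoint splits: fit the model, calibrate, then evaluate
x_fit, y_fit = x[:1000], y[:1000]
x_cal, y_cal = x[1000:1500], y[1000:1500]
x_test, y_test = x[1500:], y[1500:]

# Any point predictor works; here, one-parameter least squares
slope = np.dot(x_fit, y_fit) / np.dot(x_fit, x_fit)

def predict(t):
    return slope * t

# Conformal quantile of absolute calibration residuals
alpha = 0.1  # target miscoverage rate
scores = np.abs(y_cal - predict(x_cal))
n = scores.size
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Intervals predict(x) +/- q cover y_test with probability >= 1 - alpha
lower, upper = predict(x_test) - q, predict(x_test) + q
coverage = np.mean((y_test >= lower) & (y_test <= upper))
print(f"empirical coverage: {coverage:.3f}")
```

The guarantee is distribution-free: it relies only on exchangeability of the calibration and test points, not on the model being correct, which is why the same wrapper applies to any pre-trained predictor.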
921 stars. No commits in the last 6 months.
Stars
921
Forks
52
Language
Python
License
Apache-2.0
Last pushed
Apr 23, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/awslabs/fortuna"
Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
EmuKit/emukit
A Python-based toolbox of various methods in decision making, uncertainty quantification and...
google/uncertainty-baselines
High-quality implementations of standard and SOTA methods on a variety of tasks.
nielstron/quantulum3
Library for unit extraction - fork of quantulum for python3
IBM/UQ360
Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you...
aamini/evidential-deep-learning
Learn fast, scalable, and calibrated measures of uncertainty using neural networks!