uncertainty-baselines and awesome-uncertainty-deeplearning

                     uncertainty-baselines    awesome-uncertainty-deeplearning
Maintenance          13/25                    6/25
Adoption             10/25                    10/25
Maturity             25/25                    16/25
Community            22/25                    18/25
Stars                1,568                    787
Forks                216                      76
Downloads
Commits (30d)        1                        0
Language             Python
License              Apache-2.0               MIT
Flags                No risk flags            No Package, No Dependents

About uncertainty-baselines

google/uncertainty-baselines

High-quality implementations of standard and SOTA methods on a variety of tasks.

This project offers standardized, high-quality implementations of methods for assessing and improving the reliability of machine learning models. It takes raw training data and model configurations, and outputs performance metrics like accuracy, calibration error, and negative log-likelihood. This tool is designed for machine learning researchers and practitioners who need to evaluate model robustness and uncertainty in a consistent way.
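The metrics named above can be made concrete with a short sketch. The following is an illustrative NumPy implementation of negative log-likelihood and expected calibration error, not code from uncertainty-baselines itself; the function names, toy probabilities, and labels are hypothetical.

```python
import numpy as np

def negative_log_likelihood(probs, labels):
    """Mean NLL of the true class under the predicted probabilities."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def expected_calibration_error(probs, labels, n_bins=10):
    """ECE: bin-weighted gap between average accuracy and confidence."""
    confidences = probs.max(axis=1)
    predictions = probs.argmax(axis=1)
    accuracies = (predictions == labels).astype(float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            # Weight each bin by the fraction of examples it contains.
            ece += mask.mean() * abs(accuracies[mask].mean()
                                     - confidences[mask].mean())
    return ece

# Hypothetical softmax outputs over 3 classes for 4 examples.
probs = np.array([[0.70, 0.20, 0.10],
                  [0.10, 0.80, 0.10],
                  [0.30, 0.30, 0.40],
                  [0.90, 0.05, 0.05]])
labels = np.array([0, 1, 2, 0])
print(negative_log_likelihood(probs, labels))
print(expected_calibration_error(probs, labels))
```

A well-calibrated model has a small ECE: when it predicts a class with 80% confidence, it is right about 80% of the time.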

machine-learning-research model-robustness uncertainty-quantification predictive-modeling model-evaluation

About awesome-uncertainty-deeplearning

ENSTA-U2IS-AI/awesome-uncertainty-deeplearning

This repository contains a collection of surveys, datasets, papers, and code for predictive uncertainty estimation in deep learning models.

This is a curated collection of resources that helps machine learning practitioners understand and implement methods for estimating how certain their deep learning models are about their predictions. It provides a comprehensive list of papers, code examples, datasets, and surveys on uncertainty quantification techniques. Data scientists and AI researchers can use it to survey the field of uncertainty in deep learning, choose appropriate techniques, and apply them to their own models.
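One family of techniques covered by such resources is ensemble-based uncertainty estimation, where several stochastic predictions (e.g. from a deep ensemble or Monte Carlo dropout passes) are combined and their disagreement is measured. The sketch below is a minimal, self-contained illustration with hypothetical softmax outputs; it is not taken from any repository in the list.

```python
import numpy as np

# Hypothetical softmax outputs from K=3 ensemble members (or K stochastic
# forward passes, as in Monte Carlo dropout) for one input, 3 classes.
member_probs = np.array([[0.70, 0.20, 0.10],
                         [0.60, 0.30, 0.10],
                         [0.75, 0.15, 0.10]])

def entropy(p):
    """Shannon entropy in nats; higher means more uncertain."""
    return -np.sum(p * np.log(p + 1e-12))

# Predictive distribution: average over members.
p_mean = member_probs.mean(axis=0)

# Total uncertainty (entropy of the mean) decomposes into aleatoric
# uncertainty (mean of member entropies) plus epistemic uncertainty
# (the mutual information, i.e. member disagreement).
total = entropy(p_mean)
aleatoric = np.mean([entropy(p) for p in member_probs])
epistemic = total - aleatoric
print(total, aleatoric, epistemic)
```

A large epistemic term signals that the members disagree, which typically means the input lies far from the training distribution; the aleatoric term reflects noise the model considers irreducible.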

AI Safety Machine Learning Reliability Predictive Analytics Deep Learning Research Model Evaluation


Scores updated daily from GitHub, PyPI, and npm data.