Uncertainty Toolbox: a Python toolbox for predictive uncertainty quantification, calibration, metrics, and visualization
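To illustrate the kind of diagnostic such a toolbox provides, here is a minimal NumPy/SciPy sketch of one common regression-calibration metric: the miscalibration area between expected and observed coverage of Gaussian prediction intervals. This is a generic illustration, not the toolbox's own API; the function and variable names are hypothetical.

```python
# Minimal sketch (NumPy/SciPy only) of a calibration diagnostic for Gaussian
# predictive uncertainties: compare the expected coverage of central prediction
# intervals with the coverage actually observed on held-out data.
import numpy as np
from scipy import stats

def miscalibration_area(y_pred, y_std, y_true, n_levels=50):
    """Average gap between expected and observed coverage of central intervals.
    A value near 0 indicates well-calibrated Gaussian uncertainties."""
    levels = np.linspace(0.01, 0.99, n_levels)       # expected coverage levels
    z = stats.norm.ppf(0.5 + levels / 2)             # interval half-width in std units
    abs_err = np.abs(y_true - y_pred)[:, None]       # shape (n, 1)
    observed = (abs_err <= y_std[:, None] * z[None, :]).mean(axis=0)
    return np.mean(np.abs(observed - levels))

# Toy usage: well-specified Gaussian noise should give a small area.
rng = np.random.default_rng(0)
y_pred = np.linspace(0.0, 1.0, 1000)
y_std = np.full_like(y_pred, 0.1)
y_true = y_pred + rng.normal(0.0, 0.1, size=y_pred.shape)
print(f"miscalibration area: {miscalibration_area(y_pred, y_std, y_true):.3f}")
```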
A Library for Uncertainty Quantification.
A collection of research and application papers of (uncertainty) calibration techniques.
A toolkit for visualizations in materials informatics.
Calibration library and code for the paper: Verified Uncertainty Calibration. Ananya Kumar, Percy Liang, Tengyu Ma. NeurIPS 2019 (Spotlight).
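For orientation, the core recipe of that paper's scaling-binning calibrator can be sketched in a few lines: fit a parametric (Platt) scaling map on one data split, then discretize its outputs into equal-mass bins and output each bin's mean, which makes the remaining calibration error measurable. The sketch below is a conceptual illustration with hypothetical names, using scikit-learn's LogisticRegression; it is not the repository's own package API.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_scaling_binning(scores_cal, labels_cal, scores_bin, n_bins=10):
    """Return a calibrator: Platt scaling followed by equal-mass binning."""
    # Step 1: fit Platt scaling (a 1-D logistic regression) on the first split.
    platt = LogisticRegression().fit(scores_cal.reshape(-1, 1), labels_cal)
    scaled = platt.predict_proba(scores_bin.reshape(-1, 1))[:, 1]
    # Step 2: equal-mass bins over the scaled scores of the second split;
    # each bin predicts the average scaled score of the points it contains.
    edges = np.quantile(scaled, np.linspace(0.0, 1.0, n_bins + 1))
    inner = edges[1:-1]
    bin_ids = np.digitize(scaled, inner)
    bin_means = np.array([
        scaled[bin_ids == b].mean() if np.any(bin_ids == b) else scaled.mean()
        for b in range(n_bins)
    ])

    def calibrate(scores):
        s = platt.predict_proba(scores.reshape(-1, 1))[:, 1]
        return bin_means[np.digitize(s, inner)]

    return calibrate
```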
Code to accompany the paper 'Improving model calibration with accuracy versus uncertainty optimization'.
(ECCV 2022) BayesCap: Bayesian Identity Cap for Calibrated Uncertainty in Frozen Neural Networks
Code for evaluating uncertainty estimation methods for Transformer-based architectures in natural language understanding tasks.
A project to train your model from scratch or fine-tune a pretrained model using the losses provided in this library to improve out-of-distribution detection and uncertainty estimation performance. Calibrate your model to produce better uncertainty estimates, and detect out-of-distribution data using the chosen score type and threshold.
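As a concrete illustration of the "score type and threshold" pattern this entry describes, the sketch below computes the widely used maximum-softmax-probability score with PyTorch and flags low-scoring inputs as out-of-distribution. It is a generic example with hypothetical function names, not this library's API.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def msp_scores(model, loader, device="cpu"):
    """Maximum softmax probability per input; lower values suggest OOD."""
    model.eval()
    scores = []
    for x, *_ in loader:
        logits = model(x.to(device))
        scores.append(F.softmax(logits, dim=-1).max(dim=-1).values.cpu())
    return torch.cat(scores)

def flag_ood(scores, threshold=0.5):
    """Inputs whose score falls below the threshold are flagged as OOD."""
    return scores < threshold
```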
Source code for our paper: "LoGU: Long-form Generation with Uncertainty Expressions".
Calibration of Few-Shot Classification Tasks: Mitigating Misconfidence from Distribution Mismatch, IEEE Access, vol. 10
Service to examine data processing pipelines (e.g., machine learning or deep learning pipelines) for uncertainty consistency (calibration), fairness, and other safety-relevant aspects.
Truth Discovery Promotes Uncertainty Calibration of DNNs (UAI 2021)