Truth Discovery Promotes Uncertainty Calibration of DNNs (UAI 2021)
A service for examining data processing pipelines (e.g., machine learning or deep learning pipelines) for uncertainty calibration, fairness, and other safety-relevant aspects.
Calibration of Few-Shot Classification Tasks: Mitigating Misconfidence from Distribution Mismatch (IEEE Access, vol. 10)
A project to train a model from scratch, or fine-tune a pretrained model, using the losses provided in this library to improve out-of-distribution detection and uncertainty estimation. Calibrate your model to produce better uncertainty estimates, and detect out-of-distribution data using the chosen score type and threshold.
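The library above is not named here, so as a purely illustrative sketch of the score-plus-threshold idea it describes, the snippet below uses the maximum softmax probability as the in-distribution score; the model, score choice, and threshold value are assumptions, not this library's API.

```python
# Generic sketch: threshold-based OOD detection with the maximum softmax probability.
import torch
import torch.nn.functional as F

def max_softmax_score(model, x):
    """Maximum softmax probability per input (higher = more in-distribution)."""
    with torch.no_grad():
        logits = model(x)                # shape: (batch, num_classes)
        probs = F.softmax(logits, dim=-1)
    return probs.max(dim=-1).values      # shape: (batch,)

def detect_ood(model, x, threshold=0.5):
    """Flag inputs whose score falls below the (assumed) threshold as out-of-distribution."""
    scores = max_softmax_score(model, x)
    return scores < threshold            # True where the input looks OOD
```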
Code for evaluating uncertainty estimation methods for Transformer-based architectures in natural language understanding tasks.
BayesCap: Bayesian Identity Cap for Calibrated Uncertainty in Frozen Neural Networks (ECCV 2022)
Code to accompany the paper 'Improving model calibration with accuracy versus uncertainty optimization'.
A toolkit for visualizations in materials informatics.
Calibration library and code for the paper: Verified Uncertainty Calibration. Ananya Kumar, Percy Liang, Tengyu Ma. NeurIPS 2019 (Spotlight).
A collection of research and application papers on (uncertainty) calibration techniques.
A Library for Uncertainty Quantification.
Uncertainty Toolbox: a Python toolbox for predictive uncertainty quantification, calibration, metrics, and visualization
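To give a flavor of the calibration metrics such toolboxes compute, here is a minimal NumPy sketch of the expected calibration error (ECE) for a classifier; it illustrates the concept only and is not Uncertainty Toolbox's API, and the bin count is an arbitrary choice.

```python
# Generic sketch: expected calibration error over equal-width confidence bins.
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=15):
    """Average |accuracy - confidence| per bin, weighted by the fraction of samples in the bin."""
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(bins[:-1], bins[1:]):
        in_bin = (confidences > lo) & (confidences <= hi)
        if in_bin.any():
            gap = abs(correct[in_bin].mean() - confidences[in_bin].mean())
            ece += in_bin.mean() * gap
    return ece
```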