
uncertainty-toolbox / Uncertainty Toolbox

License: MIT
A python toolbox for predictive uncertainty quantification, calibration, metrics, and visualization

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Uncertainty Toolbox

Laser Camera Calibration Toolbox
A Laser-Camera Calibration Toolbox extending from that at http://www.cs.cmu.edu/~ranjith/lcct.html
Stars: ✭ 99 (-88.75%)
Mutual labels:  calibration, toolbox
RaPId
RaPId (a recursive acronym for "Rapid Parameter Identification") utilizes different optimization and simulation technologies to provide a framework for model validation and calibration of any kind of dynamical systems, but specifically catered to power systems.
Stars: ✭ 35 (-96.02%)
Mutual labels:  toolbox, calibration
verified calibration
Calibration library and code for the paper: Verified Uncertainty Calibration. Ananya Kumar, Percy Liang, Tengyu Ma. NeurIPS 2019 (Spotlight).
Stars: ✭ 93 (-89.43%)
Mutual labels:  toolbox, calibration
plenopticam
Light-field imaging application for plenoptic cameras
Stars: ✭ 111 (-87.39%)
Mutual labels:  toolbox, calibration
Robotlib.jl
Robotics library written in the Julia programming language
Stars: ✭ 32 (-96.36%)
Mutual labels:  toolbox, calibration
Torch Toolbox
[Active development] A toolbox to make using PyTorch much easier. Give it a star if you find it helpful.
Stars: ✭ 268 (-69.55%)
Mutual labels:  toolbox, metrics
Xinshuo pytoolbox
A Python toolbox that contains common helper functions for stream I/O, image & video processing, and visualization. All my projects depend on this toolbox.
Stars: ✭ 25 (-97.16%)
Mutual labels:  toolbox
Vector
A reliable, high-performance tool for building observability data pipelines.
Stars: ✭ 8,736 (+892.73%)
Mutual labels:  metrics
Browser Perf
Performance Metrics for Web Browsers
Stars: ✭ 930 (+5.68%)
Mutual labels:  metrics
Docker Go Graphite
Docker image for go-carbon + carbonapi + grafana
Stars: ✭ 23 (-97.39%)
Mutual labels:  metrics
Reproject Image To 3d
Comparing OpenCV's reprojectImageTo3D to my own
Stars: ✭ 13 (-98.52%)
Mutual labels:  calibration
Pirate
Realtime metrics server written in Go
Stars: ✭ 11 (-98.75%)
Mutual labels:  metrics
Spm Agent Mongodb
Sematext Agent for monitoring MongoDB
Stars: ✭ 7 (-99.2%)
Mutual labels:  metrics
St handeye graph
General hand-eye calibration based on reprojection error minimization and pose graph optimization
Stars: ✭ 26 (-97.05%)
Mutual labels:  calibration
Fathom
Fathom Lite. Simple, privacy-focused website analytics. Built with Golang & Preact.
Stars: ✭ 6,989 (+694.2%)
Mutual labels:  metrics
Gin Stats
Gin's middleware for request stats
Stars: ✭ 24 (-97.27%)
Mutual labels:  metrics
Appmetrics
Node Application Metrics provides a foundational infrastructure for collecting resource and performance monitoring data for Node.js-based applications.
Stars: ✭ 864 (-1.82%)
Mutual labels:  metrics
Logmonitor
Monitoring log files on windows systems.
Stars: ✭ 23 (-97.39%)
Mutual labels:  metrics
D4s
Dynamo DB Database Done Scala-way
Stars: ✭ 27 (-96.93%)
Mutual labels:  metrics
Gtm
Simple, seamless, lightweight time tracking for Git
Stars: ✭ 857 (-2.61%)
Mutual labels:  metrics

Uncertainty Toolbox

A python toolbox for predictive uncertainty quantification, calibration, metrics, and visualization.
Also: a glossary of useful terms and a collection of relevant papers and references.

Many machine learning methods return predictions along with uncertainties of some form, such as distributions or confidence intervals. This raises the questions: How do we determine which predictive uncertainties are best? What does it mean to produce a best or ideal uncertainty? Are our uncertainties accurate and well calibrated?

Uncertainty Toolbox provides standard metrics to quantify and compare predictive uncertainty estimates, gives intuition for these metrics, produces visualizations of these metrics/uncertainties, and implements simple "re-calibration" procedures to improve these uncertainties. This toolbox currently focuses on regression tasks.

Toolbox Contents

Uncertainty Toolbox contains:

  1. metrics for assessing the quality of predictive uncertainty estimates
  2. visualizations of predictions, uncertainties, and calibration
  3. recalibration procedures to improve average calibration
  4. a glossary of useful terms
  5. a collection of relevant papers and references

Installation

Uncertainty Toolbox requires Python 3.6+. To install, clone this repo, cd into it, and run:

$ pip install -e .

Quick Start

import uncertainty_toolbox as uct

# Load an example dataset of 100 predictions, uncertainties, and observations
predictions, predictions_std, y, x = uct.data.synthetic_sine_heteroscedastic(100)

# Compute all uncertainty metrics
metrics = uct.metrics.get_all_metrics(predictions, predictions_std, y)

This example computes metrics for a vector of predicted values (predictions) and associated uncertainties (predictions_std, a vector of standard deviations), taken with respect to a corresponding set of observed values y.

Metrics

Uncertainty Toolbox provides a number of metrics to quantify and compare predictive uncertainty estimates. For example, the get_all_metrics function will return:

  1. average calibration: mean absolute calibration error, root mean squared calibration error, miscalibration area
  2. adversarial group calibration: mean absolute adversarial group calibration error, root mean squared adversarial group calibration error
  3. sharpness: expected standard deviation
  4. proper scoring rules: negative log-likelihood, continuous ranked probability score, check score, interval score
  5. accuracy: mean absolute error, root mean squared error, median absolute error, coefficient of determination, correlation
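
Each of these metrics can also be computed on its own. Below is a minimal sketch that continues the Quick Start variables; the per-metric function names (mean_absolute_calibration_error, root_mean_squared_calibration_error, miscalibration_area, sharpness, nll_gaussian) follow the toolbox's metrics modules but are assumptions here, and may instead be exposed at the package top level or under uct.metrics_calibration / uct.metrics_scoring_rule, so check them against the installed version.

import uncertainty_toolbox as uct

# Example data, as in the Quick Start
predictions, predictions_std, y, x = uct.data.synthetic_sine_heteroscedastic(100)

# Average calibration metrics
mace = uct.metrics.mean_absolute_calibration_error(predictions, predictions_std, y)
rmsce = uct.metrics.root_mean_squared_calibration_error(predictions, predictions_std, y)
miscal_area = uct.metrics.miscalibration_area(predictions, predictions_std, y)

# Sharpness (expected standard deviation of the predictive distribution)
sharp = uct.metrics.sharpness(predictions_std)

# Proper scoring rule: Gaussian negative log-likelihood
nll = uct.metrics.nll_gaussian(predictions, predictions_std, y)

print(mace, rmsce, miscal_area, sharp, nll)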

Visualizations

The following plots are a few of the visualizations provided by Uncertainty Toolbox. See this example for code to reproduce these plots.

Overconfident (too little uncertainty)

Underconfident (too much uncertainty)

Well calibrated

And here are a few of the calibration metrics for the above three cases: mean absolute calibration error (MACE), root mean squared calibration error (RMSCE), and miscalibration area (MA).

                   MACE      RMSCE     MA
Overconfident      0.19429   0.21753   0.19625
Underconfident     0.20692   0.23003   0.20901
Well calibrated    0.00862   0.01040   0.00865
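
The plots above can be generated with the toolbox's plotting helpers. The sketch below is a rough outline only: the function names plot_intervals and plot_calibration follow the toolbox's viz module but are assumptions here, so refer to the linked example for the exact API.

import matplotlib.pyplot as plt
import uncertainty_toolbox as uct

# Example data, as in the Quick Start
predictions, predictions_std, y, x = uct.data.synthetic_sine_heteroscedastic(100)

# Ordered prediction intervals against the observations
# (assumed helper: uct.viz.plot_intervals)
uct.viz.plot_intervals(predictions, predictions_std, y)
plt.show()

# Average calibration curve: observed vs. expected proportions of observations
# falling inside centered prediction intervals
# (assumed helper: uct.viz.plot_calibration)
uct.viz.plot_calibration(predictions, predictions_std, y)
plt.show()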

Recalibration

The following plots show the results of a recalibration procedure provided by Uncertainty Toolbox, which transforms a set of predictive uncertainties to improve average calibration. The algorithm is based on isotonic regression, as proposed by Kuleshov et al.

See this example for code to reproduce these plots.
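
A minimal sketch of the procedure, continuing the Quick Start variables, is given below. The helper names get_proportion_lists_vectorized and iso_recal, and the recal_model keyword on the calibration metrics, follow the toolbox's calibration and recalibration modules but should be treated as assumptions; the linked example shows the exact API.

import uncertainty_toolbox as uct

# Example data, as in the Quick Start
predictions, predictions_std, y, x = uct.data.synthetic_sine_heteroscedastic(100)

# Expected vs. observed proportions of observations falling inside
# centered prediction intervals of increasing coverage
exp_props, obs_props = uct.metrics_calibration.get_proportion_lists_vectorized(
    predictions, predictions_std, y
)

# Fit an isotonic regression recalibration model (in the style of Kuleshov et al.).
# In practice, fit this on a held-out calibration set rather than the test set.
recal_model = uct.recalibration.iso_recal(exp_props, obs_props)

# Average calibration before and after recalibration
mace_before = uct.metrics.mean_absolute_calibration_error(predictions, predictions_std, y)
mace_after = uct.metrics.mean_absolute_calibration_error(
    predictions, predictions_std, y, recal_model=recal_model
)
print(mace_before, mace_after)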

Recalibrating overconfident predictions

                        MACE      RMSCE     MA
Before Recalibration    0.19429   0.21753   0.19625
After Recalibration     0.01124   0.02591   0.01117

Recalibrating underconfident predictions

                        MACE      RMSCE     MA
Before Recalibration    0.20692   0.23003   0.20901
After Recalibration     0.00157   0.00205   0.00132

Citation

If you use this toolbox, please consider citing one of the papers that led to its development:

@article{chung2020beyond,
  title={Beyond Pinball Loss: Quantile Methods for Calibrated Uncertainty Quantification},
  author={Chung, Youngseog and Neiswanger, Willie and Char, Ian and Schneider, Jeff},
  journal={arXiv preprint arXiv:2011.09588},
  year={2020}
}

@article{tran2020methods,
  title={Methods for comparing uncertainty quantifications for material property predictions},
  author={Tran, Kevin and Neiswanger, Willie and Yoon, Junwoong and Zhang, Qingyang and Xing, Eric and Ulissi, Zachary W},
  journal={Machine Learning: Science and Technology},
  volume={1},
  number={2},
  pages={025006},
  year={2020},
  publisher={IOP Publishing}
}

Acknowledgments

Development of Uncertainty Toolbox is supported by the following organizations.

