
TorchUQ / torchuq

License: MIT
A library for uncertainty quantification based on PyTorch


Projects that are alternatives of or similar to torchuq

awesome-conformal-prediction
A professionally curated list of awesome Conformal Prediction videos, tutorials, books, papers, PhD and MSc theses, articles and open-source libraries.
Stars: ✭ 998 (+1034.09%)
Mutual labels:  uncertainty, uncertainty-quantification
UQ360
Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you estimate, communicate and use uncertainty in machine learning model predictions.
Stars: ✭ 211 (+139.77%)
Mutual labels:  uncertainty, uncertainty-quantification
spatial-smoothing
(ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (-22.73%)
Mutual labels:  uncertainty, uncertainty-quantification
uncertainty-wizard
Uncertainty-Wizard is a plugin on top of tensorflow.keras that makes it easy and efficient to create uncertainty-aware deep neural networks. Also useful if you want to train multiple small models in parallel.
Stars: ✭ 39 (-55.68%)
Mutual labels:  uncertainty, uncertainty-quantification
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (-26.14%)
Mutual labels:  uncertainty, uncertainty-quantification
Bioindustrial-Park
BioSTEAM's Premier Repository for Biorefinery Models and Results
Stars: ✭ 23 (-73.86%)
Mutual labels:  uncertainty
uapca
Uncertainty-aware principal component analysis.
Stars: ✭ 16 (-81.82%)
Mutual labels:  uncertainty
lolo
A random forest
Stars: ✭ 37 (-57.95%)
Mutual labels:  uncertainty
Topics-In-Modern-Statistical-Learning
Materials for STAT 991: Topics In Modern Statistical Learning (UPenn, 2022 Spring) - uncertainty quantification, conformal prediction, calibration, etc
Stars: ✭ 74 (-15.91%)
Mutual labels:  uncertainty-quantification
OpenCossan
OpenCossan is an open and free toolbox for uncertainty quantification and management.
Stars: ✭ 40 (-54.55%)
Mutual labels:  uncertainty-quantification
SIVI
Uses neural networks to build expressive hierarchical distributions; a variational method to accurately estimate posterior uncertainty; a fast and general method for Bayesian inference. (ICML 2018)
Stars: ✭ 49 (-44.32%)
Mutual labels:  uncertainty-quantification
DiffEqUncertainty.jl
Fast uncertainty quantification for scientific machine learning (SciML) and differential equations
Stars: ✭ 61 (-30.68%)
Mutual labels:  uncertainty-quantification
survHE
Survival analysis in health economic evaluation. Contains a suite of functions to systematise the workflow involving survival analysis in health economic evaluation. survHE can fit a large range of survival models using both a frequentist approach (by calling the R package flexsurv) and a Bayesian perspective.
Stars: ✭ 32 (-63.64%)
Mutual labels:  uncertainty
torsionfit
Bayesian tools for fitting molecular mechanics torsion parameters to quantum chemical data.
Stars: ✭ 15 (-82.95%)
Mutual labels:  uncertainty-quantification
welleng
A collection of Wells/Drilling Engineering tools, focused on well trajectory planning for the time being.
Stars: ✭ 79 (-10.23%)
Mutual labels:  uncertainty
pestpp
tools for scalable and non-intrusive parameter estimation, uncertainty analysis and sensitivity analysis
Stars: ✭ 90 (+2.27%)
Mutual labels:  uncertainty-quantification
ww tvol study
Process global-scale satellite and airborne elevation data into time series of glacier mass change: Hugonnet et al. (2021).
Stars: ✭ 26 (-70.45%)
Mutual labels:  uncertainty-quantification
xdem
Analysis of digital elevation models (DEMs)
Stars: ✭ 50 (-43.18%)
Mutual labels:  uncertainty-quantification
sandy
Sampling nuclear data and uncertainty
Stars: ✭ 30 (-65.91%)
Mutual labels:  uncertainty
DecisionAmbiguityRecognition
Deep learning model that recognizes when people are uncertain
Stars: ✭ 16 (-81.82%)
Mutual labels:  uncertainty

TorchUQ

TorchUQ is an extensive library for uncertainty quantification (UQ) based on PyTorch. TorchUQ currently supports 10 uncertainty representations and around 50 methods for uncertainty evaluation and visualization, calibration, and conformal prediction.

Why TorchUQ

Uncertainty quantification (UQ)—prediction models should know what they do not know—finds numerous applications in active learning, statistical inference, trustworthy machine learning, and in natural science and engineering applications that are rife with sources of uncertainty. TorchUQ aims to help both practitioners and researchers use UQ methods with ease.

For practitioners

TorchUQ provides an easy-to-use arsenal of uncertainty quantification methods with the following key features:

  • Plug and Play: Simple unified interface to access a large number of UQ methods.
  • Built on PyTorch: Native GPU & auto-diff support, seamless integration with deep learning pipelines.
  • Documentation: Detailed tutorials that walk through popular UQ algorithms, plus extensive API documentation.
  • Extensive: Supports calibration, conformal prediction, multi-calibration, forecast evaluation, and more.

For researchers

TorchUQ provides a platform for conducting and distributing UQ research with the following key features:

  • Baselines: High-quality implementations of many popular baseline methods to standardize comparisons.
  • Benchmarks: A large collection of datasets from recent UQ papers, retrieved through a unified interface.
  • Distribute your research: You are welcome to distribute your algorithms via the TorchUQ interface.

Installation

First download TorchUQ. To run the code, install the dependencies with the following command:

pip3 install -r requirements.txt

A PyPI package is planned; a link will be added here once it is available.

Quickstart

We first import TorchUQ and the functions that we will use.

import torchuq
from torchuq.evaluate import distribution 
from torchuq.transform.conformal import ConformalCalibrator 
from torchuq.dataset import create_example_regression  

In this simple example, we create a set of synthetic predictions (Gaussian distributions) and recalibrate them with conformal calibration.

predictions, labels = create_example_regression()

The example predictions are intentionally miscalibrated (i.e. the labels are not drawn from the predicted distributions). We will recalibrate them with a powerful recalibration algorithm called conformal calibration. It takes the predictions and labels as input, and learns a recalibration map that can be applied to new data (here, for illustration, we apply it to the original data).
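To see what "intentionally miscalibrated" means, here is a standard-library sketch (not TorchUQ code; `NormalDist` stands in for the Gaussian predictions that `create_example_regression` returns). For a calibrated predictor, the predicted CDF evaluated at the true label (the PIT value) is uniform on [0, 1]; with biased predictions it concentrates near one end:

```python
from statistics import NormalDist

# Hypothetical stand-in for the example data: 20 Gaussian predictions
# whose means are deliberately biased upward relative to the labels.
predictions = [NormalDist(mu=i + 2.0, sigma=0.5) for i in range(20)]
labels = [float(i) for i in range(20)]

# PIT values: predicted CDF evaluated at the label. Calibrated
# predictions give roughly uniform values; these all sit near 0.
pit = [d.cdf(y) for d, y in zip(predictions, labels)]
print(max(pit))  # far below 0.5, revealing the bias
```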

calibrator = ConformalCalibrator(input_type='distribution', interpolation='linear')
calibrator.train(predictions, labels)
adjusted_predictions = calibrator(predictions)
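Conceptually, the recalibration map can be sketched in a few lines of plain Python (a simplified illustration of the idea, not TorchUQ's actual implementation): score each calibration example by the predicted CDF at its label, then remap any nominal quantile level to the empirical fraction of scores below it.

```python
import bisect
from statistics import NormalDist

def fit_conformal_map(pred_dists, labels):
    # Nonconformity scores: predicted CDF evaluated at each true label.
    scores = sorted(d.cdf(y) for d, y in zip(pred_dists, labels))
    def recalibrate(p):
        # Map a nominal quantile level p to the empirical fraction of
        # calibration scores at or below p (step interpolation).
        return bisect.bisect_right(scores, p) / len(scores)
    return recalibrate

# Toy biased predictions: each label sits one standard deviation below
# its predicted mean, so nominal quantile levels are systematically off.
dists = [NormalDist(mu=float(i), sigma=1.0) for i in range(100)]
labels = [i - 1.0 for i in range(100)]
remap = fit_conformal_map(dists, labels)
```

After fitting, `remap` adjusts quantile levels so that, on data like the calibration set, the stated coverage matches the empirical coverage.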

We can plot these distribution predictions as a sequence of density functions, with the labels shown as cross-shaped markers. As the plots show, the original predictions have systematically incorrect variance and mean, which the recalibration algorithm corrects.

distribution.plot_density_sequence(predictions, labels, smooth_bw=10)
distribution.plot_density_sequence(adjusted_predictions, labels, smooth_bw=10)

(Figures: density plot of the original predictions; density plot of the recalibrated predictions.)

What's Next?

A good way to start is to read about the basic design philosophy and usage, then go through the tutorials. All the tutorials are interactive Jupyter notebooks: you can either download them to run locally or view them statically online.
