
IBM / UQ360

License: Apache-2.0
Uncertainty Quantification 360 (UQ360) is an extensible open-source toolkit that can help you estimate, communicate and use uncertainty in machine learning model predictions.

Programming Languages

Python, R

Projects that are alternatives of or similar to UQ360

torchuq
A library for uncertainty quantification based on PyTorch
Stars: ✭ 88 (-58.29%)
Mutual labels:  uncertainty, uncertainty-quantification
spatial-smoothing
(ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (-67.77%)
Mutual labels:  uncertainty, uncertainty-quantification
pre-training
Pre-Training Buys Better Robustness and Uncertainty Estimates (ICML 2019)
Stars: ✭ 90 (-57.35%)
Mutual labels:  uncertainty, calibration
uncertainty-wizard
Uncertainty-Wizard is a plugin on top of tensorflow.keras that makes it easy and efficient to create uncertainty-aware deep neural networks. Also useful if you want to train multiple small models in parallel.
Stars: ✭ 39 (-81.52%)
Mutual labels:  uncertainty, uncertainty-quantification
awesome-conformal-prediction
A professionally curated list of awesome Conformal Prediction videos, tutorials, books, papers, PhD and MSc theses, articles and open-source libraries.
Stars: ✭ 998 (+372.99%)
Mutual labels:  uncertainty, uncertainty-quantification
Topics-In-Modern-Statistical-Learning
Materials for STAT 991: Topics In Modern Statistical Learning (UPenn, 2022 Spring) - uncertainty quantification, conformal prediction, calibration, etc
Stars: ✭ 74 (-64.93%)
Mutual labels:  calibration, uncertainty-quantification
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (-69.19%)
Mutual labels:  uncertainty, uncertainty-quantification
Bioindustrial-Park
BioSTEAM's Premier Repository for Biorefinery Models and Results
Stars: ✭ 23 (-89.1%)
Mutual labels:  uncertainty
SafeAI
Reusable, Easy-to-use Uncertainty module package built with Tensorflow, Keras
Stars: ✭ 13 (-93.84%)
Mutual labels:  uncertainty
lolo
A random forest
Stars: ✭ 37 (-82.46%)
Mutual labels:  uncertainty
CalibrationWizard
[ICCV'19] Calibration Wizard: A Guidance System for Camera Calibration Based on Modelling Geometric and Corner Uncertainty
Stars: ✭ 80 (-62.09%)
Mutual labels:  uncertainty
timely-beliefs
Model data as beliefs (at a certain time) about events (at a certain time).
Stars: ✭ 15 (-92.89%)
Mutual labels:  uncertainty
survHE
Survival analysis in health economic evaluation. Contains a suite of functions to systematise the workflow involving survival analysis in health economic evaluation. survHE can fit a large range of survival models using both a frequentist approach (by calling the R package flexsurv) and a Bayesian perspective.
Stars: ✭ 32 (-84.83%)
Mutual labels:  uncertainty
chemprop
Fast and scalable uncertainty quantification for neural molecular property prediction, accelerated optimization, and guided virtual screening.
Stars: ✭ 75 (-64.45%)
Mutual labels:  uncertainty
sandy
Sampling nuclear data and uncertainty
Stars: ✭ 30 (-85.78%)
Mutual labels:  uncertainty
MonoRUn
[CVPR'21] MonoRUn: Monocular 3D Object Detection by Reconstruction and Uncertainty Propagation
Stars: ✭ 85 (-59.72%)
Mutual labels:  uncertainty
uapca
Uncertainty-aware principal component analysis.
Stars: ✭ 16 (-92.42%)
Mutual labels:  uncertainty
welleng
A collection of Wells/Drilling Engineering tools, focused on well trajectory planning for the time being.
Stars: ✭ 79 (-62.56%)
Mutual labels:  uncertainty
ACSC
Automatic Calibration for Non-repetitive Scanning Solid-State LiDAR and Camera Systems
Stars: ✭ 210 (-0.47%)
Mutual labels:  calibration
ProSelfLC-2021
noisy labels; missing labels; semi-supervised learning; entropy; uncertainty; robustness and generalisation.
Stars: ✭ 45 (-78.67%)
Mutual labels:  uncertainty

UQ360


Uncertainty Quantification 360 (UQ360) is an open-source toolkit, distributed as a Python package, that gives data science practitioners and developers access to state-of-the-art algorithms for estimating, evaluating, improving, and communicating the uncertainty of machine learning models, as a common practice for AI transparency. The UQ360 interactive experience provides a gentle introduction to the concepts and capabilities by walking through an example use case. The tutorials and example notebooks offer a deeper, data-scientist-oriented introduction. The complete API is also available.

We have developed the package with extensibility in mind. The library is still in development, and we encourage you to contribute your uncertainty estimation algorithms, metrics, and applications. To get started as a contributor, please join the #uq360-users or #uq360-developers channel of the AIF360 Community on Slack by requesting an invitation here.


Resources

Example Use-cases

Meta-models

Use of meta-models to augment sklearn's gradient boosted regressor with prediction intervals. See the detailed example here.

from sklearn.ensemble import GradientBoostingRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

from uq360.algorithms.blackbox_metamodel import MetamodelRegression

# Create train, calibration and test splits.
X, y = make_regression(random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
X_train, X_calibration, y_train, y_calibration = train_test_split(X_train, y_train, random_state=0)

# Train the base model that provides the mean estimates.
gbr_reg = GradientBoostingRegressor(random_state=0)
gbr_reg.fit(X_train, y_train)

# Train the meta-model that can augment the mean prediction with prediction intervals.
uq_model = MetamodelRegression(base_model=gbr_reg)
uq_model.fit(X_calibration, y_calibration, base_is_prefitted=True)

# Obtain mean estimates and prediction interval on the test data.
y_hat, y_hat_lb, y_hat_ub = uq_model.predict(X_test)
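
The intervals returned above can also be scored directly with UQ360's regression metrics. The sketch below is a minimal example, assuming the picp and mpiw helpers are importable from uq360.metrics (they are defined in uq360.metrics.regression_metrics); adjust the import path if your version differs.

# A minimal sketch of scoring the intervals obtained above.
# Assumed import path: uq360.metrics re-exports the regression metrics.
from uq360.metrics import picp, mpiw

# Fraction of test targets covered by the predicted intervals
# (closer to the nominal level, e.g. 0.95, is better).
coverage = picp(y_test, y_hat_lb, y_hat_ub)

# Mean prediction interval width (narrower is better at comparable coverage).
width = mpiw(y_hat_lb, y_hat_ub)

print("PICP:", coverage, "MPIW:", width)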

UQ360 metrics for model selection

The prediction interval coverage probability (PICP) is used here as the metric to select the model through cross-validation. See the detailed example here.

from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.model_selection import GridSearchCV
from uq360.utils.misc import make_sklearn_compatible_scorer
from uq360.algorithms.quantile_regression import QuantileRegression

# Create a sklearn scorer using UQ360 PICP metric.
sklearn_picp = make_sklearn_compatible_scorer(
    task_type="regression",
    metric="picp", greater_is_better=True)

# Hyper-parameters configuration using GridSearchCV.
base_config = {"alpha":0.95, "n_estimators":20, "max_depth": 3, 
               "learning_rate": 0.01, "min_samples_leaf": 10,
               "min_samples_split": 10}
configs  = {"config": []}
for num_estimators in [1, 2, 5, 10, 20, 30, 40, 50]:
    config = base_config.copy()
    config["n_estimators"] = num_estimators
    configs["config"].append(config)

# Create train test split.
X, y = make_regression(random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Initialize QuantileRegression UQ360 model and wrap it in GridSearchCV with PICP as the scoring function.
uq_model = GridSearchCV(
    QuantileRegression(config=base_config), configs, scoring=sklearn_picp)

# Fit the model on the training set.
uq_model.fit(X_train, y_train)

# Obtain the prediction intervals for the test set.
y_hat, y_hat_lb, y_hat_ub = uq_model.predict(X_test)
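
Because GridSearchCV follows the standard scikit-learn estimator API, the configuration selected by cross-validation and its mean PICP score can be inspected after fitting, for example:

# Inspect the configuration chosen by cross-validation and its score.
print("Best config:", uq_model.best_params_)
print("Best cross-validated PICP:", uq_model.best_score_)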

Setup

Supported Configurations:

OS        Python version
macOS     3.7
Ubuntu    3.7
Windows   3.7

(Optional) Create a virtual environment

A virtual environment manager is strongly recommended so that dependencies can be installed safely. If you have trouble installing the toolkit, try this first.

Conda

Conda is recommended for all configurations, though Virtualenv is generally interchangeable for our purposes. Miniconda is sufficient (see the difference between Anaconda and Miniconda if you are curious) and can be installed from here if you do not already have it.

Then, to create a new Python 3.7 environment, run:

conda create --name uq360 python=3.7
conda activate uq360

The shell should now look like (uq360) $. To deactivate the environment, run:

(uq360)$ conda deactivate

The prompt will return to $ or (base)$.

Note: Older versions of conda may use source activate uq360 and source deactivate (activate uq360 and deactivate on Windows).

Installation

Clone the latest version of this repository:

(uq360)$ git clone https://github.com/IBM/UQ360

If you'd like to run the examples and tutorial notebooks, download the datasets now and place them in their respective folders as described in uq360/data/README.md.

Then, navigate to the root directory of the project, which contains the setup.py file, and run:

(uq360)$ pip install -e .
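
As a quick sanity check (not part of the official instructions), you can verify the editable install by importing the package and one of the algorithm classes used in the examples above from the activated environment:

# Minimal smoke test for the installation; run with the (uq360) environment active.
import uq360
from uq360.algorithms.quantile_regression import QuantileRegression

print("UQ360 imported successfully")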

PIP Installation of Uncertainty Quantification 360

If you would like to quickly start using the UQ360 toolkit without cloning this repository, you can install the uq360 PyPI package as follows:

(your environment)$ pip install uq360

If you follow this approach, you may need to download the notebooks in the examples folder separately.

Using UQ360

The examples directory contains a diverse collection of Jupyter notebooks that use UQ360 in various ways. Both the examples and the tutorial notebooks illustrate working code using the toolkit. Tutorials provide additional discussion that walks the user through the various steps of the notebook. See the details about tutorials and examples here.

Citing UQ360

A technical description of UQ360 is available in this paper. Below is the BibTeX entry for it.

@misc{uq360-june-2021,
      title={Uncertainty Quantification 360: A Holistic Toolkit for Quantifying 
      and Communicating the Uncertainty of AI}, 
      author={Soumya Ghosh and Q. Vera Liao and Karthikeyan Natesan Ramamurthy 
      and Jiri Navratil and Prasanna Sattigeri 
      and Kush R. Varshney and Yunfeng Zhang},
      year={2021},
      eprint={2106.01410},
      archivePrefix={arXiv},
      primaryClass={cs.AI}
}

Acknowledgements

UQ360 is built with the help of several open-source packages, all of which are listed in setup.py.

License Information

Please view the LICENSE file in the root directory for license information.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].