
cambridge-mlg / DUN

Licence: MIT
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)

Projects that are alternatives of or similar to DUN

spatial-smoothing
(ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (+4.62%)
Mutual labels:  uncertainty, uncertainty-quantification, bayesian-neural-networks, robustness, bayesian-deep-learning
Bayesian Neural Networks
Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
Stars: ✭ 900 (+1284.62%)
Mutual labels:  reproducible-research, bayesian-inference, variational-inference
noisy-K-FAC
Natural Gradient, Variational Inference
Stars: ✭ 29 (-55.38%)
Mutual labels:  bayesian-inference, variational-inference, bayesian-neural-networks
artificial neural networks
A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (-38.46%)
Mutual labels:  bayesian-inference, variational-inference, bayesian-neural-networks
uncertainty-wizard
Uncertainty-Wizard is a plugin on top of tensorflow.keras that allows you to easily and efficiently create uncertainty-aware deep neural networks. Also useful if you want to train multiple small models in parallel.
Stars: ✭ 39 (-40%)
Mutual labels:  uncertainty, uncertainty-neural-networks, uncertainty-quantification
Celeste.jl
Scalable inference for a generative model of astronomical images
Stars: ✭ 142 (+118.46%)
Mutual labels:  bayesian-inference, variational-inference
Rethinking Tensorflow Probability
Statistical Rethinking (2nd Ed) with Tensorflow Probability
Stars: ✭ 152 (+133.85%)
Mutual labels:  bayesian-inference, variational-inference
SafeAI
Reusable, Easy-to-use Uncertainty module package built with Tensorflow, Keras
Stars: ✭ 13 (-80%)
Mutual labels:  uncertainty, bayesian-neural-networks
torchuq
A library for uncertainty quantification based on PyTorch
Stars: ✭ 88 (+35.38%)
Mutual labels:  uncertainty, uncertainty-quantification
Topcuoglu ML mBio 2020
Best practices for applying machine learning to bacterial 16S rRNA gene sequencing data
Stars: ✭ 21 (-67.69%)
Mutual labels:  reproducible-research, reproducible-paper
Reproducibilty-Challenge-ECANET
Unofficial Implementation of ECANets (CVPR 2020) for the Reproducibility Challenge 2020.
Stars: ✭ 27 (-58.46%)
Mutual labels:  reproducible-research, reproducible-paper
Vbmc
Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference in MATLAB
Stars: ✭ 123 (+89.23%)
Mutual labels:  bayesian-inference, variational-inference
Bcpd
Bayesian Coherent Point Drift (BCPD/BCPD++); Source Code Available
Stars: ✭ 116 (+78.46%)
Mutual labels:  bayesian-inference, variational-inference
Probabilistic Models
Collection of probabilistic models and inference algorithms
Stars: ✭ 217 (+233.85%)
Mutual labels:  bayesian-inference, variational-inference
Gpstuff
GPstuff - Gaussian process models for Bayesian analysis
Stars: ✭ 106 (+63.08%)
Mutual labels:  bayesian-inference, variational-inference
Mxfusion
Modular Probabilistic Programming on MXNet
Stars: ✭ 95 (+46.15%)
Mutual labels:  bayesian-inference, variational-inference
pre-training
Pre-Training Buys Better Robustness and Uncertainty Estimates (ICML 2019)
Stars: ✭ 90 (+38.46%)
Mutual labels:  uncertainty, robustness
SIVI
Uses neural networks to build expressive hierarchical distributions; a variational method to accurately estimate posterior uncertainty; a fast and general method for Bayesian inference (ICML 2018).
Stars: ✭ 49 (-24.62%)
Mutual labels:  uncertainty-quantification, variational-inference
Pytorch Bayesiancnn
Bayesian Convolutional Neural Network with Variational Inference based on Bayes by Backprop in PyTorch.
Stars: ✭ 779 (+1098.46%)
Mutual labels:  bayesian-inference, variational-inference
Pyro
Deep universal probabilistic programming with Python and PyTorch
Stars: ✭ 7,224 (+11013.85%)
Mutual labels:  bayesian-inference, variational-inference

Depth Uncertainty in Neural Networks

Training a 10-layer DUN on our toy 'Matern' dataset.

arXiv · Python 3.7+ · PyTorch 1.3 · License: MIT

Existing methods for estimating uncertainty in deep learning tend to require multiple forward passes, making them unsuitable for applications where computational resources are limited. To solve this, we perform probabilistic reasoning over the depth of neural networks. Different depths correspond to subnetworks which share weights and whose predictions are combined via marginalisation, yielding model uncertainty. By exploiting the sequential structure of feed-forward networks, we are able to both evaluate our training objective and make predictions with a single forward pass. We validate our approach on real-world regression and image classification tasks. Our approach provides uncertainty calibration, robustness to dataset shift, and accuracies competitive with more computationally expensive baselines.
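
The scheme is easy to sketch: run the network once, read off a prediction from a shared output head at every depth, and average those predictions under a learned distribution over depths. The PyTorch snippet below is a minimal illustration under assumed names (DepthMarginalisedNet, depth_logits); it is not the repo's implementation.

import torch
import torch.nn as nn

class DepthMarginalisedNet(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim, n_layers):
        super().__init__()
        self.input_layer = nn.Linear(in_dim, hidden_dim)
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
            for _ in range(n_layers))
        self.head = nn.Linear(hidden_dim, out_dim)  # shared across depths
        # Variational posterior over depth: one logit per subnetwork.
        self.depth_logits = nn.Parameter(torch.zeros(n_layers))

    def forward(self, x):
        h = torch.relu(self.input_layer(x))
        per_depth = []
        for block in self.blocks:           # a single sequential pass
            h = block(h)
            per_depth.append(self.head(h))  # prediction from this depth
        preds = torch.stack(per_depth)             # (n_layers, batch, out)
        q_d = torch.softmax(self.depth_logits, 0)  # distribution over depths
        return torch.einsum('d,dbo->bo', q_d, preds)  # marginalised prediction

Because every per-depth prediction is produced by the same pass, the training objective can be evaluated at the same cost as a single prediction.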

Requirements

Python packages:

  • hpbandster 0.7.4
  • jupyter 1.0.0
  • matplotlib 3.2.1
  • numpy 1.18.3
  • pandas 0.24.2
  • Pillow 6.2.0
  • test-tube 0.7.5
  • torch 1.3.1
  • torchvision 0.4.2
  • tqdm 4.44.1

Running Experiments from the Paper

Install the DUN package and its requirements by running the following commands in the root directory of the project:

pip install -r requirements.txt
pip install -e .

Change to the experiments directory:

cd experiments

Toy Data Experiments

First change to the toy subdirectory:

cd experiments/toy

All experiments with toy data can be produced with the following script. Plots are generated automatically.

python train_toy.py --help

For example:

python train_toy.py --inference DUN --N_layers 10  --overcount 1  --width 100  --n_epochs 1000 --dataset wiggle --lr 0.001 --wd 0.0001

Regression Experiments

First change to the regression subdirectory:

cd experiments/regression

The regression experiments require 4 stages of computation:

  1. Hyperparameter optimisation
  2. Training models with the optimal hyperparameters
  3. Evaluating all models
  4. Plotting the results

For stage 1, change to the hyperparams subdirectory:

cd experiments/regression/hyperparams

Then run the following script for all combinations of datasets, splits, and inference methods of interest.

python run_opt.py --help

For example:

python run_opt.py --dataset boston --n_split 0 --min_budget 200 --max_budget 2000 --early_stop 200 --n_iterations 20 --run_id 0 --method SGD 

Note that a unique run_id must be supplied for each run.
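
To cover a full grid of datasets, splits, and methods while keeping run ids unique, a small driver along these lines can help. The dataset and method lists are illustrative assumptions; only boston and SGD appear in this README.

import itertools
import subprocess

datasets = ['boston', 'concrete']  # assumed subset of the regression datasets
methods = ['SGD', 'DUN']           # assumed subset of the inference methods

for run_id, (dataset, split, method) in enumerate(
        itertools.product(datasets, range(5), methods)):
    # enumerate guarantees a unique run_id for every run
    subprocess.run(
        ['python', 'run_opt.py', '--dataset', dataset,
         '--n_split', str(split), '--min_budget', '200',
         '--max_budget', '2000', '--early_stop', '200',
         '--n_iterations', '20', '--run_id', str(run_id),
         '--method', method],
        check=True)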

For stage 2, change to the retrain_best subdirectory:

cd experiments/regression/retrain_best

Then run the following script for all combinations of datasets, splits, and inference methods for which hyperparameter optimisation was run.

python final_train.py --help

For example:

python final_train.py --dataset boston --split 0 --method SGD

For stage 3, go to the regression subdirectory:

cd experiments/regression

Run the following script.

python evaluate_final_models_unnorm.py --help

This script doesn't require any command line arguments; simply run:

python evaluate_final_models_unnorm.py

For stage 4, go to the experiments directory:

cd experiments

The regression experiment plots can now be generated by executing the appropriate cells in the regression_and_image_PLOTS.ipynb notebook. Launch jupyter:

jupyter-notebook

Note that the flights dataset must first be unzipped before it can be used.
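
One way to do this is with Python's standard zipfile module; the archive path below is an assumption, so adjust it to wherever the flights data lives in your checkout.

import zipfile

# Extract the flights archive next to the other regression datasets.
with zipfile.ZipFile('data/flights.zip') as zf:
    zf.extractall('data/')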

Image Experiments

First change to the image subdirectory:

cd experiments/image

The image experiments require 4 stages of computation:

  1. Training baselines
  2. Training DUNs
  3. Evaluating the models
  4. Plotting the results

For stage 1, run the train_baselines script, for each baseline configuration and dataset of interest:

python train_baselines.py --help

Note that, unlike the toy data and regression experiments, there are no flags to specify the inference method. Instead, the inference method is implied by the arguments you pass. For example, to train an SGD model, we do not need to change any of the default arguments:

python train_baselines.py --dataset MNIST

To train a dropout model, specify the p_drop and mcsamples arguments:

python train_baselines.py --dataset MNIST --p_drop 0.1 --mcsamples 10

To train an ensemble, simply train multiple SGD models. Each model will automatically be saved to a unique directory.

python train_baselines.py --dataset MNIST
python train_baselines.py --dataset MNIST
python train_baselines.py --dataset MNIST
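
Equivalently, a short loop trains an ensemble of any size; five members here is a free choice, not a value from the paper.

import subprocess

for _ in range(5):
    # Each invocation saves its model to a fresh, unique directory.
    subprocess.run(['python', 'train_baselines.py', '--dataset', 'MNIST'],
                   check=True)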

For stage 2, run the train_DUN script, for each DUN configuration and dataset of interest:

python train_DUN.py --help

For example:

python train_DUN.py --dataset MNIST

For stage 3, run the run_final_image_experiments script for each inference method and dataset trained in the previous steps.

python run_final_image_experiments.py --help

For example:

python run_final_image_experiments.py --method=DUN --dataset=MNIST

or

python run_final_image_experiments.py --method=ensemble --dataset=CIFAR10

For stage 4, go to the experiments directory:

cd experiments

The image experiment plots can now be generated by executing the appropriate cells in the regression_and_image_PLOTS.ipynb notebook. Launch jupyter:

jupyter-notebook

Citation

If you find this code useful, please consider citing our paper:

Javier Antorán, James Urquhart Allingham, and José Miguel Hernández-Lobato. (2020). Depth Uncertainty in Neural Networks. arXiv:2006.08437.

@misc{antoran2020depth,
    title={Depth Uncertainty in Neural Networks},
    author={Javier Antorán and James Urquhart Allingham and José Miguel Hernández-Lobato},
    year={2020},
    eprint={2006.08437},
    archivePrefix={arXiv},
    primaryClass={stat.ML}
}