
hutec / Uncertaintynn

Implementation and evaluation of different approaches to obtaining uncertainty in neural networks

Projects that are alternatives of or similar to Uncertaintynn

Text Classification
An example on how to train supervised classifiers for multi-label text classification using sklearn pipelines
Stars: ✭ 100 (-0.99%)
Mutual labels:  jupyter-notebook
Codeinquarantine
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Spectralnormalizationkeras
Spectral Normalization for Keras Dense and Convolution Layers
Stars: ✭ 100 (-0.99%)
Mutual labels:  jupyter-notebook
Europilot
A toolkit for controlling Euro Truck Simulator 2 with python to develop self-driving algorithms.
Stars: ✭ 1,366 (+1252.48%)
Mutual labels:  jupyter-notebook
Noworkflow
Supporting infrastructure to run scientific experiments without a scientific workflow management system.
Stars: ✭ 100 (-0.99%)
Mutual labels:  jupyter-notebook
Mish Cuda
Mish Activation Function for PyTorch
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Data What Now
All codes from the DataWhatNow blog.
Stars: ✭ 100 (-0.99%)
Mutual labels:  jupyter-notebook
Nn From Scratch
Implementing a Neural Network from Scratch
Stars: ✭ 1,374 (+1260.4%)
Mutual labels:  jupyter-notebook
Unet
Generic U-Net Tensorflow 2 implementation for semantic segmentation
Stars: ✭ 100 (-0.99%)
Mutual labels:  jupyter-notebook
Genetic Algorithm
Genetic algorithm tutorial for Python
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Deep Learning Coursera
Deep Learning Specialization by Andrew Ng, deeplearning.ai.
Stars: ✭ 1,366 (+1252.48%)
Mutual labels:  jupyter-notebook
Sequana
Sequana: a set of Snakemake NGS pipelines
Stars: ✭ 100 (-0.99%)
Mutual labels:  jupyter-notebook
Hass Deepstack Face
Home Assistant custom component for using Deepstack face recognition
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Awesome Pytorch List Cnversion
Chinese translation of Awesome-pytorch-list; work in progress...
Stars: ✭ 1,361 (+1247.52%)
Mutual labels:  jupyter-notebook
Cvnd localization exercises
Notebooks for learning about object motion and localization methods in the last section of CVND.
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Maps Location History
Get, concatenate and process your location history from Google Maps Timeline
Stars: ✭ 99 (-1.98%)
Mutual labels:  jupyter-notebook
Airbnb Amenity Detection
Repo for 42 days project to replicate/improve Airbnb's amenity (object) detection pipeline.
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Traffic sign recognition efficient cnns
A repository for the paper "Real-Time Traffic Sign Recognition Based on Efficient CNNs in the Wild"
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Data Driven Discretization 1d
Code for "Learning data-driven discretizations for partial differential equations"
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook
Context aug
Context-driven data augmentation for Object Detection (ECCV'18)
Stars: ✭ 101 (+0%)
Mutual labels:  jupyter-notebook

Uncertainty in Neural Networks

This project contains different implementations and evaluations of approaches to model uncertainty in neural networks.

  • Bootstrapping method from Osband et al.
  • (Monte Carlo) Dropout method by Gal
  • Combined method (heteroscedastic aleatoric + epistemic) from Kendall & Gal
  • Mixture Density Networks as used by Choi et al.

These models are evaluated on 2D data for function approximation. Specifically, there is a dataset of six points at (-1, 1), (-1, -1), (0, 1), (0, -1), (1, 1), and (1, -1), which exposes problems with the Dropout and Combined methods, and an "x + sin(x)" function with added noise.
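As a rough sketch of how these toy datasets could be constructed (the actual data-generation code in the repo may differ; the function names, sample count, and noise level are assumptions):

  import numpy as np

  def six_point_dataset():
      # Six fixed points; the two conflicting y-values per x-position make the
      # target ambiguous, which is what exposes problems with the Dropout and
      # Combined methods.
      x = np.array([-1.0, -1.0, 0.0, 0.0, 1.0, 1.0]).reshape(-1, 1)
      y = np.array([1.0, -1.0, 1.0, -1.0, 1.0, -1.0]).reshape(-1, 1)
      return x, y

  def sine_dataset(n=200, noise_std=0.1, seed=0):
      # y = x + sin(x) with additive Gaussian noise (noise level assumed).
      rng = np.random.default_rng(seed)
      x = np.linspace(-4.0, 4.0, n).reshape(-1, 1)
      y = x + np.sin(x) + rng.normal(0.0, noise_std, size=x.shape)
      return x, y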

Datasets

Additionally, the evaluation is done on MNIST data, for which I crafted adversarial attacks to evaluate the effectiveness of the uncertainty methods.

Methods

Bootstrapping

The idea is to have k distinct heads that share a common base network. The dataset is masked so that every head only sees a subset of the data. Predictive mean and variance can then be obtained by averaging over the predictions of all heads.

https://arxiv.org/pdf/1602.04621v3.pdf
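
A minimal sketch of the shared-body, multi-head idea, written here in PyTorch (the framework, layer sizes, and bootstrap-masking scheme used in this repo may differ; all of them are assumptions):

  import torch
  import torch.nn as nn

  class BootstrapHeads(nn.Module):
      # Shared body with k independent heads; during training each head only
      # receives gradients from the bootstrap subset assigned to it by its mask.
      def __init__(self, k=10, hidden=64):
          super().__init__()
          self.body = nn.Sequential(nn.Linear(1, hidden), nn.ReLU())
          self.heads = nn.ModuleList([nn.Linear(hidden, 1) for _ in range(k)])

      def forward(self, x):
          h = self.body(x)
          return torch.stack([head(h) for head in self.heads], dim=0)  # (k, N, 1)

  # Predictive mean and variance across heads:
  # preds = model(x); mean = preds.mean(dim=0); var = preds.var(dim=0)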

Monte-Carlo Dropout

Dropout is used during both training and test time, which amounts to approximate variational inference. Mean and variance are obtained by performing stochastic forward passes (MC Dropout) and averaging over the outputs. This model cannot differentiate between aleatoric and epistemic uncertainty.

http://mlg.eng.cam.ac.uk/yarin/thesis/thesis.pdf
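
A hedged sketch of MC Dropout at prediction time (dropout rate, layer sizes, and number of forward passes are assumptions):

  import torch
  import torch.nn as nn

  model = nn.Sequential(
      nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.1),
      nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.1),
      nn.Linear(64, 1),
  )

  def mc_dropout_predict(model, x, n_samples=100):
      # Keep dropout active at test time and average stochastic forward passes.
      model.train()  # train() keeps the Dropout layers sampling
      with torch.no_grad():
          preds = torch.stack([model(x) for _ in range(n_samples)], dim=0)
      return preds.mean(dim=0), preds.var(dim=0)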

Combined (Aleatoric + Epistemic Uncertainty) Method

Using Monte Carlo Dropout together with a modified loss function, heteroscedastic aleatoric and epistemic uncertainty can be obtained separately and also combined.

https://arxiv.org/pdf/1703.04977.pdf
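
The modified loss learns a per-input mean and log-variance; a sketch of the heteroscedastic regression loss described in the paper (variable names are assumptions, and epistemic uncertainty still comes from MC Dropout over the same network):

  import torch

  def heteroscedastic_nll(mean, log_var, target):
      # 0.5 * exp(-s) * (y - mu)^2 + 0.5 * s, with s = log(sigma^2).
      # The learned sigma^2 captures the per-sample aleatoric noise.
      return (0.5 * torch.exp(-log_var) * (target - mean) ** 2
              + 0.5 * log_var).mean()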

Mixture Density Networks

The last layer(s) are replaced by a layer that outputs the parameters of a Gaussian mixture model.

https://arxiv.org/pdf/1709.02249.pdf
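
A sketch of a Gaussian-mixture output layer and its negative log-likelihood loss (the number of mixture components and layer sizes are assumptions):

  import torch
  import torch.nn as nn

  class MDNHead(nn.Module):
      # Final layer outputs the parameters of a Gaussian mixture:
      # mixing weights pi, means mu and standard deviations sigma.
      def __init__(self, hidden=64, n_components=5):
          super().__init__()
          self.pi = nn.Linear(hidden, n_components)
          self.mu = nn.Linear(hidden, n_components)
          self.log_sigma = nn.Linear(hidden, n_components)

      def forward(self, h):
          return torch.softmax(self.pi(h), dim=-1), self.mu(h), self.log_sigma(h).exp()

  def mdn_nll(pi, mu, sigma, y):
      # Negative log-likelihood of target y (shape (N, 1)) under the mixture.
      comp = torch.distributions.Normal(mu, sigma)
      log_prob = comp.log_prob(y) + torch.log(pi + 1e-8)
      return -torch.logsumexp(log_prob, dim=-1).mean()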

Results

Results can be found in the results directory or generated with

  • python evaluation/boostrap_evaluation.py
  • python evaluation/combined_evaluation.py
  • python evaluation/dropout_evaluation.py
  • python evaluation/mixture_evaluation.py

or all at once with python evaluation/evaluate_all.py

Bootstrap Results

TODO

  • Working with higher dimensional data (MNIST)
  • Analyze influence of adversarial attacks

Problems

  • When running from the command line, you might have to set $PYTHONPATH to the root directory: export PYTHONPATH=.