
wjmaddox / Swa_gaussian

License: BSD-2-Clause
Code repo for "A Simple Baseline for Bayesian Uncertainty in Deep Learning"


A Simple Baseline for Bayesian Deep Learning

This repository contains a PyTorch implementation of Stochastic Weight Averaging-Gaussian (SWAG) from the paper

A Simple Baseline for Bayesian Uncertainty in Deep Learning

by Wesley Maddox, Timur Garipov, Pavel Izmailov, Dmitry Vetrov, and Andrew Gordon Wilson

Introduction

SWA-Gaussian (SWAG) is a convenient method for uncertainty representation and calibration in Bayesian deep learning. The key idea of SWAG is that the SGD iterates, with a modified learning rate schedule, act like samples from a Gaussian distribution; SWAG fits this Gaussian distribution by capturing the SWA mean and a covariance matrix, representing the first two moments of the SGD iterates. We then use this Gaussian distribution as an approximate posterior over the neural network weights and perform a Bayesian model average for uncertainty representation and calibration.
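
As a rough illustration, the sketch below shows the diagonal variant of this idea (SWAG-Diag): keep running estimates of the first and second moments of the SGD iterates, then sample weights from the resulting Gaussian. This is a simplified example, not the implementation in this repository; the names here (SWAGDiagonal, flatten_params) are made up for the illustration, and the real classes live in swag/posteriors/swag.py.

import torch

def flatten_params(model):
    # concatenate all parameters into a single flat vector
    return torch.cat([p.detach().reshape(-1) for p in model.parameters()])

class SWAGDiagonal:
    # running first and second moments of the SGD iterates (diagonal covariance only)
    def __init__(self, model):
        w = flatten_params(model)
        self.mean = torch.zeros_like(w)
        self.sq_mean = torch.zeros_like(w)
        self.n = 0

    def collect(self, model):
        # call periodically (e.g. once per epoch) after the SWA burn-in phase
        w = flatten_params(model)
        self.mean = (self.n * self.mean + w) / (self.n + 1)
        self.sq_mean = (self.n * self.sq_mean + w ** 2) / (self.n + 1)
        self.n += 1

    def sample(self, scale=1.0):
        # draw a weight vector from N(mean, scale * diag(sq_mean - mean^2))
        var = (self.sq_mean - self.mean ** 2).clamp(min=1e-30)
        return self.mean + (scale ** 0.5) * var.sqrt() * torch.randn_like(self.mean)

The full SWAG posterior additionally stores a low-rank matrix of deviations from the running mean, so that samples also capture off-diagonal covariance structure.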

In this repo, we implement SWAG for image classification with several different architectures on both CIFAR datasets and ImageNet. We also implement SWAG for semantic segmentation on CamVid using our implementation of an FCDenseNet67. We additionally include several other experiments exploring the covariance of the gradients of the SGD iterates, the eigenvalues of the Hessian, and width/PCA decompositions of the SWAG approximate posterior.

[Figures: CIFAR10 → STL10 and CIFAR100 results]

Please cite our work if you find it useful:

@inproceedings{maddox_2019_simple,
  title={A simple baseline for bayesian uncertainty in deep learning},
  author={Maddox, Wesley J and Izmailov, Pavel and Garipov, Timur and Vetrov, Dmitry P and Wilson, Andrew Gordon},
  booktitle={Advances in Neural Information Processing Systems},
  pages={13153--13164},
  year={2019}
}

Installation:

python setup.py develop

See the requirements.txt file for the requirements from our setup. We used PyTorch 1.0.0 in our experiments.

Unless otherwise described, all experiments were run on a single GPU. Note that if you are using CUDA 10, you may need to manually install PyTorch with the correct CUDA toolkit.
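
For example, a conda-based install would look something like the command below; this is only an illustration, and the exact command for your PyTorch and CUDA versions should be taken from the official PyTorch installation instructions.

conda install pytorch torchvision cudatoolkit=10.0 -c pytorch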

File Structure

.
+-- swag/
|   +-- posteriors/
|   |   +-- swag.py (class definitions for SWA, SWAG, and SWAG-Diag)
|   |   +-- laplace.py (class definition for KFAC Laplace)
|   +-- models/ (folder with all model definitions)
|   +-- utils.py (utility functions)
+-- experiments/
|   +-- train/ (folder containing standard training scripts for non-ImageNet data)
|   +-- imagenet/ (folder containing ImageNet training scripts)
|   +-- grad_cov/ (gradient covariance and optimal learning rate experiments)
|   +-- hessian_eigs/ (folder for eigenvalues of the Hessian)
|   +-- segmentation/ (folder containing training scripts for segmentation experiments)
|   +-- uncertainty/ (folder containing scripts and methods for all uncertainty experiments; see the sketch below)
|   +-- width/ (folder containing scripts for PCA and SVD of SGD trajectories)
+-- tests/ (folder containing tests for SWAG sampling and SWAG log-likelihood calculation)
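
As a rough sketch of how the uncertainty experiments use the fitted posterior, the function below averages softmax predictions over several weight samples (a Bayesian model average). It continues the illustrative SWAGDiagonal example from the introduction and is not the repository's actual API; load_flat_params and swag_predict are hypothetical names, and in practice batch-norm statistics should be recomputed for each weight sample with a pass over the training data.

import torch
import torch.nn.functional as F

def load_flat_params(model, flat):
    # copy a flattened weight sample back into the model's parameters
    offset = 0
    for p in model.parameters():
        n = p.numel()
        p.data.copy_(flat[offset:offset + n].view_as(p))
        offset += n

@torch.no_grad()
def swag_predict(model, swag_posterior, loader, num_samples=30):
    # average the predictive distribution over weight samples from the SWAG posterior
    probs = None
    for _ in range(num_samples):
        load_flat_params(model, swag_posterior.sample())
        batch_probs = torch.cat([F.softmax(model(x), dim=1) for x, _ in loader])
        probs = batch_probs if probs is None else probs + batch_probs
    return probs / num_samples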

Example Commands

See the READMEs in experiments/ for each set of experiments:

Image Classification

Segmentation

Uncertainty

Some other commands are listed here:

Hessian eigenvalues

cd experiments/hessian_eigs; python run_hess_eigs.py --dataset CIFAR100 --data_path [data_path] --model PreResNet110 --use_test --file [ckpt] --save_path [output.npz]

Gradient covariances

cd experiments/grad_cov; python run_grad_cov.py --dataset CIFAR100 --data_path [data_path] --model VGG16 --use_test --epochs=300 --lr_init=0.05 --wd=5e-4 --swa --swa_start 161 --swa_lr=0.01 --grad_cov_start 251 --dir [dir]

Note that this will print the gradient covariances to the console, so you should write them to a log file and retrieve them afterwards.
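
For example, appending something like "| tee grad_cov.log" to the command above (the log file name is just an example) saves the output to a file while still printing it to the console.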

References for Code Base

Stochastic weight averaging: PyTorch repo; most of the base methods and model definitions are built on this repo.

Model implementations:

Hessian eigenvalue computation: PyTorch repo, but we ultimately ended up using GPyTorch as it allows calculation of more eigenvalues.

Segmentation evaluation metrics: Lasagne repo
