
pawni / BayesByHypernet

Licence: other
Code for the paper Implicit Weight Uncertainty in Neural Networks

Programming Languages

Jupyter Notebook
Python

Projects that are alternatives to or similar to BayesByHypernet

Dropouts
PyTorch Implementations of Dropout Variants
Stars: ✭ 72 (+14.29%)
Mutual labels:  variational-inference, bayesian-neural-networks
noisy-K-FAC
Natural Gradient, Variational Inference
Stars: ✭ 29 (-53.97%)
Mutual labels:  variational-inference, bayesian-neural-networks
artificial neural networks
A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (-36.51%)
Mutual labels:  variational-inference, bayesian-neural-networks
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (+3.17%)
Mutual labels:  variational-inference, bayesian-neural-networks
AI Learning Hub
AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
Stars: ✭ 53 (-15.87%)
Mutual labels:  variational-inference
boundary-gp
Know Your Boundaries: Constraining Gaussian Processes by Variational Harmonic Features
Stars: ✭ 21 (-66.67%)
Mutual labels:  variational-inference
Rethinking Numpyro
Statistical Rethinking (2nd ed.) with NumPyro
Stars: ✭ 225 (+257.14%)
Mutual labels:  variational-inference
Awesome Normalizing Flows
A list of awesome resources on normalizing flows.
Stars: ✭ 203 (+222.22%)
Mutual labels:  variational-inference
active-inference
A toy model of Friston's active inference in Tensorflow
Stars: ✭ 36 (-42.86%)
Mutual labels:  variational-inference
SafeAI
Reusable, Easy-to-use Uncertainty module package built with Tensorflow, Keras
Stars: ✭ 13 (-79.37%)
Mutual labels:  bayesian-neural-networks
BGCN
A Tensorflow implementation of "Bayesian Graph Convolutional Neural Networks" (AAAI 2019).
Stars: ✭ 129 (+104.76%)
Mutual labels:  bayesian-neural-networks
spatial-smoothing
(ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (+7.94%)
Mutual labels:  bayesian-neural-networks
rss
Regression with Summary Statistics.
Stars: ✭ 42 (-33.33%)
Mutual labels:  variational-inference
Good Papers
I try my best to keep updated cutting-edge knowledge in Machine Learning/Deep Learning and Natural Language Processing. These are my notes on some good papers
Stars: ✭ 248 (+293.65%)
Mutual labels:  variational-inference
PyLDA
A Latent Dirichlet Allocation implementation in Python.
Stars: ✭ 51 (-19.05%)
Mutual labels:  variational-inference
Probabilistic Models
Collection of probabilistic models and inference algorithms
Stars: ✭ 217 (+244.44%)
Mutual labels:  variational-inference
ccube
Bayesian mixture models for estimating and clustering cancer cell fractions
Stars: ✭ 23 (-63.49%)
Mutual labels:  variational-inference
adaptive-f-divergence
A tensorflow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence"
Stars: ✭ 20 (-68.25%)
Mutual labels:  variational-inference
vireo
Demultiplexing pooled scRNA-seq data with or without genotype reference
Stars: ✭ 34 (-46.03%)
Mutual labels:  variational-inference
normalizing-flows
PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+330.16%)
Mutual labels:  variational-inference

Implicit Weight Uncertainty in Neural Networks

This repository contains the code for the paper Implicit Weight Uncertainty in Neural Networks (arXiv).

A starting point for a PyTorch reimplementation is available here.

Abstract

Modern neural networks tend to be overconfident on unseen, noisy or incorrectly labelled data and do not produce meaningful uncertainty measures. Bayesian deep learning aims to address this shortcoming with variational approximations (such as Bayes by Backprop or Multiplicative Normalising Flows). However, current approaches have limitations regarding flexibility and scalability. We introduce Bayes by Hypernet (BbH), a new method of variational approximation that interprets hypernetworks as implicit distributions. It naturally uses neural networks to model arbitrarily complex distributions and scales to modern deep learning architectures. In our experiments, we demonstrate that our method achieves competitive accuracies and predictive uncertainties on MNIST and a CIFAR5 task, while being the most robust against adversarial attacks.
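As a rough illustration of the core idea: a hypernetwork g maps samples of a simple noise variable z to weight samples w = g(z), so the variational posterior q(w) is defined implicitly by the generator. Training targets the usual variational objective E_q[log p(D|w)] - KL(q(w) || p(w)), where the KL term of an implicit q must be estimated rather than computed in closed form. The minimal PyTorch sketch below shows only the sampling side; it is not the repo's TensorFlow implementation, all names and sizes are made up, and the paper's prior/KL estimation is omitted.

# Minimal sketch of the sampling side of Bayes by Hypernet (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightGenerator(nn.Module):
    # Hypernetwork: maps a noise vector z to a flat weight vector for one layer.
    def __init__(self, noise_dim, n_weights, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_weights),
        )

    def forward(self, z):
        return self.net(z)

class BbHLinear(nn.Module):
    # Linear layer whose weights are drawn from the implicit
    # distribution defined by the generator.
    def __init__(self, in_features, out_features, noise_dim=8):
        super().__init__()
        self.noise_dim = noise_dim
        self.in_features, self.out_features = in_features, out_features
        self.gen = WeightGenerator(noise_dim, in_features * out_features)
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        z = torch.randn(1, self.noise_dim, device=x.device)  # fresh noise -> fresh weight sample
        w = self.gen(z).view(self.out_features, self.in_features)
        return F.linear(x, w, self.bias)

# Predictive uncertainty via Monte Carlo: average softmax outputs
# over several weight samples.
layer = BbHLinear(784, 10)
x = torch.randn(5, 784)
probs = torch.stack([F.softmax(layer(x), dim=-1)
                     for _ in range(10)]).mean(0)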

Usage

The following libraries were used for development (see the note after the list for one way to install them):

future==0.16.0
jupyter==1.0.0
matplotlib==2.2.2
notebook==5.0.0
numpy==1.14.3
observations==0.1.4
pandas==0.19.2
scikit-learn==0.19.1
scipy==1.1.0
seaborn==0.8.1
tensorflow-gpu==1.7.0
tqdm==4.19.5
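To reproduce this environment, the pins above can be copied into a requirements.txt and installed with pip:

pip install -r requirements.txt

Note that tensorflow-gpu==1.7.0 predates TensorFlow 2, so an older Python (3.6 or earlier, or 2.7 given the future pin) is likely required.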

Structure

toy_data.ipynb contains the code for the toy regression experiment. The other files contain the code for the MNIST and CIFAR experiments. The run_* scripts simply launch the experiments. base_layers and layers implement easy-to-use layers for the different VI methods, networks holds the models, and the actual training and evaluation logic lives in experiments and utils.

Contact

For discussion, suggestions, or questions, don't hesitate to contact [email protected].

Commands to run the experiments (a sketch of the shared command-line flags follows the MNIST commands):

MNIST:

python run_bbh_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/bbh/ -x layer_a_prior1_fullkernel_noise8 --layer_wise_gen --noise_shape 8 -a --prior_scale 1. --full_kernel -c 0
python run_bbh_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/bbh/ -x layer_a_prior1_fullkernel_noise1 --layer_wise_gen --noise_shape 1 -a --prior_scale 1. --full_kernel -c 0
python run_bbh_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/bbh/ -x layer_a_prior1_fullkernel_noise64 --layer_wise_gen --noise_shape 64 -a --prior_scale 1. --full_kernel -c 0

python run_bbh_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/bbh/ -x layer_a_prior1_fullkernel_indnoise8 --independent_noise --layer_wise_gen --noise_shape 8 -a --prior_scale 1. --full_kernel -c 0
python run_bbh_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/bbh/ -x layer_a_prior1_fullkernel_indnoise1 --independent_noise --layer_wise_gen --noise_shape 1 -a --prior_scale 1. --full_kernel -c 0
python run_bbh_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/bbh/ -x layer_a_prior1_fullkernel_indnoise64 --independent_noise --layer_wise_gen --noise_shape 64 -a --prior_scale 1. --full_kernel -c 0

python run_dropout_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/dropout/ -x dropout_standard -c 0
python run_map_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/map/ -x map_standard -c 0
python run_ensemble_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/ensemble/ -x ensemble_standard -c 1
python run_mnf_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/mnf/ -x mnf_a -a -c 3
python run_bbb_exp.py -e 100 -p /vol/biomedic2/np716/bbh_uai/mnist/bbb/ -x prior_1 --prior_scale 1. -c 4
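The same flags reappear across the run_* scripts. The sketch below is a hypothetical reconstruction of that shared interface, inferred only from the commands in this README and the paper; the actual parsers in the repo may differ, and the meaning of -a is not evident from the commands alone.

# Hypothetical flag interface for the run_* scripts (inferred, not verified).
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-e', type=int, default=100,
                    help='number of training epochs')
parser.add_argument('-p', type=str,
                    help='output directory for logs and checkpoints')
parser.add_argument('-x', type=str,
                    help='experiment name')
parser.add_argument('-c', type=int, default=0,
                    help='GPU index to run on')
parser.add_argument('-a', action='store_true',
                    help='purpose not evident from the commands alone')
parser.add_argument('--prior_scale', type=float, default=1.,
                    help='scale of the weight prior')
parser.add_argument('--noise_shape', type=int, default=8,
                    help='dimensionality of the hypernetwork input noise')
parser.add_argument('--layer_wise_gen', action='store_true',
                    help='use one weight generator per layer')
parser.add_argument('--independent_noise', action='store_true',
                    help='sample independent noise per generator')
parser.add_argument('--full_kernel', action='store_true',
                    help='generate full kernels instead of slices')
args = parser.parse_args()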



CIFAR:


python run_dropout_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/dropout/ -x dropout_standard -c 0
python run_map_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/map/ -x map_standard -c 0
python run_bbb_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/bbb/ -x prior_1 --prior_scale 1. -c 2
python run_ensemble_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/ensemble/ -x ensemble_standard -c 0
python run_mnf_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/mnf/ -x mnf_a -a -c 1
python run_bbh_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/bbh/ -x layer_a_prior1_fullkernel_noise8 --layer_wise_gen --noise_shape 8 -a --prior_scale 1. --full_kernel -c 1
python run_bbh_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/bbh/ -x layer_a_prior1_fullkernel_noise1 --layer_wise_gen --noise_shape 1 -a --prior_scale 1. --full_kernel -c 0
python run_bbh_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/bbh/ -x layer_a_prior1_fullkernel_noise64 --layer_wise_gen --noise_shape 64 -a --prior_scale 1. --full_kernel -c 0

python run_bbh_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/bbh/ -x layer_a_prior1_fullkernel_indnoise8 --independent_noise --layer_wise_gen --noise_shape 8 -a --prior_scale 1. --full_kernel -c 0
python run_bbh_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/bbh/ -x layer_a_prior1_fullkernel_indnoise1 --independent_noise --layer_wise_gen --noise_shape 1 -a --prior_scale 1. --full_kernel -c 0
python run_bbh_cifar_resnet_exp.py -e 200 -p /vol/biomedic2/np716/bbh_uai/cifar/bbh/ -x layer_a_prior1_fullkernel_indnoise64 --independent_noise --layer_wise_gen --noise_shape 64 -a --prior_scale 1. --full_kernel -c 0
