
dtak / ocbnn-public

License: MIT License
General-purpose library for BNNs, and an implementation of OC-BNNs from our NeurIPS 2020 paper.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to ocbnn-public

SuperNNova
Open Source Photometric classification https://supernnova.readthedocs.io
Stars: ✭ 18 (-41.94%)
Mutual labels:  bayesian-neural-networks, bayesian-deep-learning
spatial-smoothing
(ICML 2022) Official PyTorch implementation of “Blurs Behave Like Ensembles: Spatial Smoothings to Improve Accuracy, Uncertainty, and Robustness”.
Stars: ✭ 68 (+119.35%)
Mutual labels:  bayesian-neural-networks, bayesian-deep-learning
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (+109.68%)
Mutual labels:  bayesian-neural-networks, bayesian-deep-learning
Wavelet-like-Auto-Encoder
No description or website provided.
Stars: ✭ 61 (+96.77%)
Mutual labels:  paper
OneStopEnglishCorpus
No description or website provided.
Stars: ✭ 38 (+22.58%)
Mutual labels:  paper
Islands
A spigot plugin for creating customisable home islands with different biomes. https://www.spigotmc.org/resources/islands-home-islands-system.84303/
Stars: ✭ 18 (-41.94%)
Mutual labels:  paper
paperback
Paper backup generator suitable for long-term storage.
Stars: ✭ 517 (+1567.74%)
Mutual labels:  paper
CADA
Attending to Discriminative Certainty for Domain Adaptation
Stars: ✭ 17 (-45.16%)
Mutual labels:  bayesian-deep-learning
maml-rl-tf2
Implementation of Model-Agnostic Meta-Learning (MAML) applied on Reinforcement Learning problems in TensorFlow 2.
Stars: ✭ 16 (-48.39%)
Mutual labels:  paper
Facial-Recognition-Attendance-System
An attendance system which uses facial recognition to detect which people are present in any image.
Stars: ✭ 48 (+54.84%)
Mutual labels:  paper
INFO320
Neural Networks and Bayesian Learning
Stars: ✭ 24 (-22.58%)
Mutual labels:  bayesian-deep-learning
time series clustering via community detection
Code used in the paper "Time Series Clustering via Community Detection in Networks"
Stars: ✭ 27 (-12.9%)
Mutual labels:  paper
Lottery Ticket Hypothesis-TensorFlow 2
Implementing "The Lottery Ticket Hypothesis" paper by "Jonathan Frankle, Michael Carbin"
Stars: ✭ 28 (-9.68%)
Mutual labels:  paper
sensim
Sentence Similarity Estimator (SenSim)
Stars: ✭ 15 (-51.61%)
Mutual labels:  paper
fake-news-detection
This repo is a collection of AWESOME things about fake news detection, including papers, code, etc.
Stars: ✭ 34 (+9.68%)
Mutual labels:  paper
my-bookshelf
Collection of books/papers that I've read/I'm going to read/I would remember that they exist/It is unlikely that I'll read/I'll never read.
Stars: ✭ 49 (+58.06%)
Mutual labels:  paper
TMNet
The official pytorch implemention of the CVPR paper "Temporal Modulation Network for Controllable Space-Time Video Super-Resolution".
Stars: ✭ 77 (+148.39%)
Mutual labels:  paper
DRL in CV Papers
Research publications in Computer Vision journals and conferences (and arXiv) using RL.
Stars: ✭ 31 (+0%)
Mutual labels:  paper
paper-template
Collection of paper latex template for several computer vision related conference.
Stars: ✭ 63 (+103.23%)
Mutual labels:  paper
Movecraft
The original movement plugin for Bukkit. Reloaded. Again.
Stars: ✭ 79 (+154.84%)
Mutual labels:  paper

Output-Constrained Bayesian Neural Networks (OC-BNN)

This open-source repo is both (i) a general-purpose implementation of BNNs and (ii) an implementation of OC-BNNs, which our NeurIPS 2020 paper introduces as a way for users to specify output constraints on BNNs. In addition to reproducing the results in our paper, we hope that this codebase will be a helpful resource for researchers working with BNNs.

Feel free to send a pull request for bug fixes or extra features.

Our NeurIPS paper follows an earlier non-archival paper at the 2019 ICML Workshop on Uncertainty and Robustness in Deep Learning. To see an earlier snapshot of this repo as released for the workshop paper, switch to the workshop branch.

Brief Introduction

If you're not familiar with BNNs, check out Chapter 1 of Wanqian's senior thesis for an introduction. Other good resources include Radford Neal's thesis and David MacKay's paper, both early seminal works on BNNs. More recent (as of 2020) high-level overviews include a primer and a paper, both from Andrew Gordon Wilson's group. Finally, the annual Bayesian Deep Learning workshop at NeurIPS is always a good resource for important contributions to the field.

The key challenge our paper targets is imposing output constraints on BNNs, which constrain the posterior predictive distribution Y|X, D for some set of inputs X; for example, if 1 < X < 2, then the distribution over Y should place probability mass only on negative output values. We formulate various tractable prior distributions that allow the BNN to learn such constraints effectively. The ability to incorporate output constraints is useful because they are a currency for functional, interpretable knowledge. Model users, who may not have technical ML expertise, can easily specify constraints encoding prior knowledge they possess that is not always reflected in the training distribution. For example, a doctor could specify that the model should never predict certain classes of drugs if the patient's systolic blood pressure is below, say, 90 mm Hg.
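
To make the running example concrete, the constraint above can be read as a predicate over (input, output) pairs. The snippet below is purely illustrative notation, not the repo's API (constraints are specified through the methods described under Questions below):

    def constraint_satisfied(x, y):
        # For inputs in (1, 2), outputs must be negative; elsewhere, unconstrained.
        return y < 0 if 1 < x < 2 else True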

For more details, check out our paper (linked above).

Getting Started

This codebase is written in Python 3.7.4 and built on top of PyTorch. The only setup step is to run the shell command pip install -r requirements.txt to install all dependencies; you might find it helpful to set up a virtual environment for managing them. That's it, you are good to go!

The best way to start using the library is to check out and run run_tutorial.py in the root folder. It walks through training a vanilla BNN on a toy dataset and carrying out prediction, and shows how an output constraint can be specified and learnt.
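
For orientation, the overall workflow looks roughly like the hypothetical sketch below. Except for add_deterministic_constraint (a documented method, see the Questions section), every name here, including the BNN class and its import path, is an assumption; run_tutorial.py is the authoritative reference.

    import torch
    from bnn import BNN  # assumed import path

    bnn = BNN("config.yaml")               # BNNs are instantiated from a YAML config
    bnn.load_dataset("toy")                # assumed: a dataset registered in data/dataloader.py
    bnn.add_deterministic_constraint(...)  # documented in bnn/base.py (see Questions)
    bnn.infer()                            # assumed: run posterior inference
    X_test = torch.linspace(-5, 5, 100).unsqueeze(1)
    preds = bnn.predict(X_test)            # assumed: posterior predictive samples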

In total, there are 4 scripts that you can run (see the files themselves for optional command-line arguments):

  • run_tutorial.py: short example of how to use this codebase and its functionalities
  • run_toys.py: contains all synthetic experiments in Section 5 (and Appendix D) of our paper
  • run_apps.py: contains the high-dimensional applications in Section 6 of our paper (except the MIMIC-III healthcare application in Section 6.1, omitted for privacy reasons)
  • run_bakeoff.py: compares posterior predictive distributions of various BNN inference methods (see below) on the same examples

In particular, run_toys.py contains a comprehensive set of examples of all the various output-constrained priors in our paper, so if you want to implement a specific kind of constraint (e.g. positive, negative, or probabilistic), check out the corresponding experiment.

The repro/ folder contains config files and pre-trained posterior samples of all our experiments, so you can run these scripts with the --pretrained flag to immediately generate the relevant plots/results without having to run posterior inference yourself.

Our codebase contains implementations of 4 different inference algorithms, which together represent a good diversity of both MCMC and variational methods (a minimal illustrative sketch of one of them, SVGD, follows the list):

  1. Hamiltonian Monte Carlo
  2. Stein Variational Gradient Descent
  3. Stochastic Gradient Langevin Dynamics
  4. Bayes by Backprop
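
To give a flavor of these methods, here is a minimal, self-contained sketch of a single SVGD update for a generic log-density, assuming particles are flattened weight vectors. This is a textbook-style sketch, not the repo's implementation (see bnn/inference.py for that); rbf_kernel, svgd_step, and log_prob are all hypothetical names.

    import torch

    def rbf_kernel(x):
        # x: (n_particles, dim). Pairwise squared distances and RBF kernel matrix.
        sq_dists = torch.cdist(x, x) ** 2
        # Median heuristic for the bandwidth (epsilon guards the degenerate case).
        h = sq_dists.median() / torch.log(torch.tensor(x.shape[0] + 1.0)) + 1e-8
        k = torch.exp(-sq_dists / h)
        # Repulsive term: sum over j of grad_{x_j} k(x_j, x_i).
        grad_k = (2.0 / h) * (k.sum(1, keepdim=True) * x - k @ x)
        return k, grad_k

    def svgd_step(particles, log_prob, lr=1e-3):
        # One SVGD update: kernel-smoothed log-density gradients plus repulsion.
        x = particles.detach().requires_grad_(True)
        grad_logp = torch.autograd.grad(log_prob(x).sum(), x)[0]
        k, grad_k = rbf_kernel(x.detach())
        phi = (k @ grad_logp + grad_k) / x.shape[0]
        return (x + lr * phi).detach()

For a BNN, log_prob would be the (unnormalized) log-posterior over network weights, i.e. the log-prior plus the log-likelihood.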

Questions

How do I add my own dataset?

You must add a wrapper function in data/dataloader.py; check out the file's docstring for detailed instructions.
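
As a hypothetical example of what such a wrapper might produce (the actual required signature and registration mechanism are documented in data/dataloader.py, and the names below are illustrative assumptions):

    import numpy as np

    def my_toy_dataset():
        # 100 points of noisy cubic regression data on [-3, 3].
        rng = np.random.default_rng(0)
        X = rng.uniform(-3.0, 3.0, size=(100, 1))
        Y = X ** 3 + rng.normal(0.0, 1.0, size=(100, 1))
        return X, Y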

How do I add my own inference method?

You must add a mixin class in bnn/inference.py; check out the file's docstring for detailed instructions.
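
Purely as a shape for what a mixin might look like (the real interface, expected attributes, and naming conventions are documented in bnn/inference.py; num_weights and log_posterior below are assumed helpers):

    import torch

    class MyInferenceMixin:
        def infer_gradient_ascent(self, n_iters=1000, lr=1e-3):
            """Toy 'inference': gradient ascent on the log-posterior,
            keeping iterates as crude pseudo-samples (illustration only)."""
            w = torch.zeros(self.num_weights, requires_grad=True)  # assumed attribute
            samples = []
            for _ in range(n_iters):
                grad, = torch.autograd.grad(self.log_posterior(w), w)  # assumed helper
                w = (w + lr * grad).detach().requires_grad_(True)
                samples.append(w.detach().clone())
            return torch.stack(samples)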

Where are hyperparameters and such defined?

A BNN object is instantiated with a YAML config file. See config.yaml in the root level of the repo for explanations of each hyperparameter.
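
Since config.yaml is plain YAML, you can also inspect the hyperparameters programmatically; the key names are defined by the repo and not reproduced here:

    import yaml

    with open("config.yaml") as f:
        config = yaml.safe_load(f)
    for key, value in config.items():
        print(f"{key}: {value}")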

How do I add a constraint to the BNN?

This is done by calling bnn.add_deterministic_constraint(...) or bnn.add_probabilistic_constraint(...). See the method docstrings in bnn/base.py for arguments.
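
Here is a hypothetical invocation encoding the running example from the introduction (negative outputs for inputs in (1, 2)); the parameter names below are assumptions, and the actual signatures live in the docstrings in bnn/base.py:

    bnn.add_deterministic_constraint(
        constrained_region=lambda x: (1 < x) & (x < 2),  # assumed parameter name
        constrained_outputs=lambda x, y: y < 0,          # assumed parameter name
    )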

How do I write my own prior distribution?

Add your own stuff to the bnn.log_prior() method in bnn/base.py. Currently, we've implemented the baseline isotropic Gaussian prior as well as our own output-constrained priors. Note that both the prior and likelihood functions are computed in log-space.
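
As a minimal sketch of what a log-space prior looks like, here is the baseline isotropic Gaussian written as a standalone function; the repo's own version lives in bnn.log_prior() and may differ in detail:

    import math
    import torch

    def isotropic_gaussian_log_prior(weights, sigma=1.0):
        # log N(w; 0, sigma^2 I) = -d/2 * log(2*pi*sigma^2) - ||w||^2 / (2*sigma^2)
        d = weights.numel()
        return (-0.5 * d * math.log(2 * math.pi * sigma ** 2)
                - 0.5 * (weights ** 2).sum() / sigma ** 2)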

Citation

@inproceedings{yang2020interpretable,
  title={Incorporating {I}nterpretable {O}utput {C}onstraints in {B}ayesian {N}eural {N}etworks},
  author={Yang, Wanqian and Lorch, Lars and Graule, Moritz A and Lakkaraju, Himabindu and Doshi-Velez, Finale},
  booktitle={Advances in {N}eural {I}nformation {P}rocessing {S}ystems},
  url={https://arxiv.org/abs/2010.10969},
  year={2020}
}