
jhuggins / viabel

License: MIT
Efficient, lightweight variational inference and approximation bounds

Programming Languages

python
139335 projects - #7 most used programming language
Makefile
30231 projects

Projects that are alternatives to or similar to viabel

Mxfusion
Modular Probabilistic Programming on MXNet
Stars: ✭ 95 (+251.85%)
Mutual labels:  bayesian-inference, variational-inference
Celeste.jl
Scalable inference for a generative model of astronomical images
Stars: ✭ 142 (+425.93%)
Mutual labels:  bayesian-inference, variational-inference
Gpstuff
GPstuff - Gaussian process models for Bayesian analysis
Stars: ✭ 106 (+292.59%)
Mutual labels:  bayesian-inference, variational-inference
PyTorch-BayesianCNN
Bayesian Convolutional Neural Network with Variational Inference based on Bayes by Backprop in PyTorch.
Stars: ✭ 779 (+2785.19%)
Mutual labels:  bayesian-inference, variational-inference
PyLDA
A Latent Dirichlet Allocation implementation in Python.
Stars: ✭ 51 (+88.89%)
Mutual labels:  bayesian-inference, variational-inference
Pyro
Deep universal probabilistic programming with Python and PyTorch
Stars: ✭ 7,224 (+26655.56%)
Mutual labels:  bayesian-inference, variational-inference
Vbmc
Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference in MATLAB
Stars: ✭ 123 (+355.56%)
Mutual labels:  bayesian-inference, variational-inference
Bayesian Neural Networks
Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
Stars: ✭ 900 (+3233.33%)
Mutual labels:  bayesian-inference, variational-inference
DUN
Code for "Depth Uncertainty in Neural Networks" (https://arxiv.org/abs/2006.08437)
Stars: ✭ 65 (+140.74%)
Mutual labels:  bayesian-inference, variational-inference
Probabilistic Models
Collection of probabilistic models and inference algorithms
Stars: ✭ 217 (+703.7%)
Mutual labels:  bayesian-inference, variational-inference
PyMC3
Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Aesara
Stars: ✭ 6,214 (+22914.81%)
Mutual labels:  bayesian-inference, variational-inference
artificial neural networks
A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (+48.15%)
Mutual labels:  bayesian-inference, variational-inference
Bcpd
Bayesian Coherent Point Drift (BCPD/BCPD++); Source Code Available
Stars: ✭ 116 (+329.63%)
Mutual labels:  bayesian-inference, variational-inference
Rethinking Tensorflow Probability
Statistical Rethinking (2nd Ed) with Tensorflow Probability
Stars: ✭ 152 (+462.96%)
Mutual labels:  bayesian-inference, variational-inference
ReactiveMP.jl
Julia package for automatic Bayesian inference on a factor graph with reactive message passing
Stars: ✭ 58 (+114.81%)
Mutual labels:  bayesian-inference, variational-inference
noisy-K-FAC
Natural Gradient, Variational Inference
Stars: ✭ 29 (+7.41%)
Mutual labels:  bayesian-inference, variational-inference
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-25.93%)
Mutual labels:  variational-inference
lagvae
Lagrangian VAE
Stars: ✭ 27 (+0%)
Mutual labels:  variational-inference
Dropouts
PyTorch Implementations of Dropout Variants
Stars: ✭ 72 (+166.67%)
Mutual labels:  variational-inference
deep-active-inference-mc
Deep active inference agents using Monte-Carlo methods
Stars: ✭ 41 (+51.85%)
Mutual labels:  variational-inference

VIABEL: Variational Inference and Approximation Bounds that are Efficient and Lightweight


VIABEL is a library (still in early development) that provides two types of functionality:

  1. A lightweight, flexible set of methods for variational inference that is agnostic to how the model is constructed. All that is required is a log density and its gradient.
  2. Methods for computing bounds on the errors of the mean, standard deviation, and variance estimates produced by a continuous approximation to an (unnormalized) distribution. A canonical application is a variational approximation to a Bayesian posterior distribution.
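To make the model-agnostic interface in item 1 concrete, the sketch below defines the only two ingredients such a black-box variational inference method needs: an unnormalized log density and its gradient, here for a correlated 2-D Gaussian target. The function names are illustrative, not part of the viabel API.

```python
import numpy as np

# Precision matrix of a correlated 2-D Gaussian target.
# Any model works, as long as the two callables below are available.
PREC = np.linalg.inv(np.array([[1.0, 0.8],
                               [0.8, 2.0]]))

def log_density(x):
    """Unnormalized log p(x); the normalizing constant is never needed."""
    return -0.5 * x @ PREC @ x

def log_density_grad(x):
    """Gradient of the unnormalized log density with respect to x."""
    return -PREC @ x

# A black-box VI method only ever evaluates these two functions:
x = np.array([0.5, -1.0])
print(log_density(x), log_density_grad(x))
```

Because nothing here depends on how the model was built (Stan, PyTorch, hand-written NumPy), the same optimizer can be reused across modeling frameworks.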
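The error bounds in item 2 build on Wasserstein distances. As a toy illustration of the idea (not the viabel API), for two univariate Gaussians the 2-Wasserstein distance has a closed form, and it upper-bounds the error of the approximate mean via |E_p X − E_q X| ≤ W1(p, q) ≤ W2(p, q):

```python
import numpy as np

# For univariate Gaussians N(m1, s1^2) and N(m2, s2^2):
#   W2(p, q)^2 = (m1 - m2)^2 + (s1 - s2)^2  (closed form)
def w2_gaussian(m1, s1, m2, s2):
    return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

m_p, s_p = 0.0, 1.0   # "true" posterior
m_q, s_q = 0.3, 0.8   # variational approximation

bound = w2_gaussian(m_p, s_p, m_q, s_q)
mean_error = abs(m_p - m_q)
print(mean_error, bound)  # the mean error never exceeds the W2 bound
```

In practice the true posterior's moments are unknown, which is exactly why computable upper bounds of this kind are useful diagnostics.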

Documentation

For examples and API documentation, see readthedocs.

Installation

You can install the latest stable version using pip install viabel. Alternatively, you can clone the repository and use the master branch to get the most up-to-date version.
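The two installation routes described above can be run as follows; the repository URL assumes the jhuggins/viabel GitHub path shown at the top of this page.

```shell
# Latest stable release from PyPI
pip install viabel

# Or: most up-to-date version from the master branch
git clone https://github.com/jhuggins/viabel.git
cd viabel
pip install .
```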

Citing VIABEL

If you use this package, please cite:

Validated Variational Inference via Practical Posterior Error Bounds. Jonathan H. Huggins, Mikołaj Kasprzak, Trevor Campbell, Tamara Broderick. In Proc. of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS), Palermo, Italy. PMLR: Volume 108, 2020.

The equivalent BibTeX entry is:

@inproceedings{Huggins:2020:VI,
  author = {Huggins, Jonathan H and Kasprzak, Miko{\l}aj and Campbell, Trevor and Broderick, Tamara},
  title = {{Validated Variational Inference via Practical Posterior Error Bounds}},
  booktitle = {Proc. of the 23rd International Conference on Artificial Intelligence and Statistics (AISTATS)},
  year = {2020}
}