
ex4sperans / Variational Inference With Normalizing Flows

Reimplementation of Variational Inference with Normalizing Flows (https://arxiv.org/abs/1505.05770)

Programming Languages

python

Projects that are alternatives of or similar to Variational Inference With Normalizing Flows

Pytorch Bayesiancnn
Bayesian Convolutional Neural Network with Variational Inference based on Bayes by Backprop in PyTorch.
Stars: ✭ 779 (+433.56%)
Mutual labels:  variational-inference
Inverse rl
Adversarial Imitation Via Variational Inverse Reinforcement Learning
Stars: ✭ 79 (-45.89%)
Mutual labels:  variational-inference
Bayes By Backprop
PyTorch implementation of "Weight Uncertainty in Neural Networks"
Stars: ✭ 119 (-18.49%)
Mutual labels:  variational-inference
Bayesian Neural Networks
Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
Stars: ✭ 900 (+516.44%)
Mutual labels:  variational-inference
Rnn Vae
Variational Autoencoder with Recurrent Neural Network based on Google DeepMind's "DRAW: A Recurrent Neural Network For Image Generation"
Stars: ✭ 39 (-73.29%)
Mutual labels:  variational-inference
Gpflow
Gaussian processes in TensorFlow
Stars: ✭ 1,547 (+959.59%)
Mutual labels:  variational-inference
Pymc3
Probabilistic Programming in Python: Bayesian Modeling and Probabilistic Machine Learning with Aesara
Stars: ✭ 6,214 (+4156.16%)
Mutual labels:  variational-inference
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (-4.79%)
Mutual labels:  variational-inference
Deepbayes 2018
Seminars DeepBayes Summer School 2018
Stars: ✭ 1,021 (+599.32%)
Mutual labels:  variational-inference
Bcpd
Bayesian Coherent Point Drift (BCPD/BCPD++); Source Code Available
Stars: ✭ 116 (-20.55%)
Mutual labels:  variational-inference
Variational gradient matching for dynamical systems
Sample code for the NIPS paper "Scalable Variational Inference for Dynamical Systems"
Stars: ✭ 22 (-84.93%)
Mutual labels:  variational-inference
Gp Infer Net
Scalable Training of Inference Networks for Gaussian-Process Models, ICML 2019
Stars: ✭ 37 (-74.66%)
Mutual labels:  variational-inference
Gpstuff
GPstuff - Gaussian process models for Bayesian analysis
Stars: ✭ 106 (-27.4%)
Mutual labels:  variational-inference
Variational Autoencoder
Variational autoencoder implemented in tensorflow and pytorch (including inverse autoregressive flow)
Stars: ✭ 807 (+452.74%)
Mutual labels:  variational-inference
Vbmc
Variational Bayesian Monte Carlo (VBMC) algorithm for posterior and model inference in MATLAB
Stars: ✭ 123 (-15.75%)
Mutual labels:  variational-inference
Boltzmann Machines
Boltzmann Machines in TensorFlow with examples
Stars: ✭ 768 (+426.03%)
Mutual labels:  variational-inference
Mxfusion
Modular Probabilistic Programming on MXNet
Stars: ✭ 95 (-34.93%)
Mutual labels:  variational-inference
Celeste.jl
Scalable inference for a generative model of astronomical images
Stars: ✭ 142 (-2.74%)
Mutual labels:  variational-inference
Normalizing Flows
Understanding normalizing flows
Stars: ✭ 126 (-13.7%)
Mutual labels:  variational-inference
Kvae
Kalman Variational Auto-Encoder
Stars: ✭ 115 (-21.23%)
Mutual labels:  variational-inference

Variational Inference with Normalizing Flows

Reimplementation of Variational Inference with Normalizing Flows (https://arxiv.org/abs/1505.05770)

The idea is to approximate a complex, multimodal probability density by starting from a simple base density and applying a sequence of invertible nonlinear transforms. Inference in such a model requires computing multiple Jacobian determinants, which can be computationally expensive. The authors propose a specific form of the transformation that reduces the cost of computing each Jacobian determinant from roughly O(D^3) to O(D), where D is the dimensionality of the data.
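To make the O(D) cost concrete, here is a minimal sketch of a single planar-flow layer in PyTorch, in the spirit of the paper. It is not the code from this repository; the class name and initialization are illustrative, and the paper's reparameterization of u that guarantees invertibility is omitted for brevity.

```python
import torch
import torch.nn as nn

class PlanarFlow(nn.Module):
    """One planar transformation f(z) = z + u * tanh(w^T z + b).

    By the matrix determinant lemma, |det df/dz| = |1 + u^T psi(z)| with
    psi(z) = (1 - tanh^2(w^T z + b)) * w, so the log-determinant costs O(D)
    per sample instead of the O(D^3) of a generic Jacobian determinant.
    """

    def __init__(self, dim):
        super().__init__()
        self.w = nn.Parameter(torch.randn(dim) * 0.01)
        self.u = nn.Parameter(torch.randn(dim) * 0.01)
        self.b = nn.Parameter(torch.zeros(1))

    def forward(self, z):
        # z has shape (batch, dim)
        lin = z @ self.w + self.b                                # (batch,)
        f_z = z + self.u * torch.tanh(lin).unsqueeze(-1)         # (batch, dim)
        psi = (1 - torch.tanh(lin) ** 2).unsqueeze(-1) * self.w  # (batch, dim)
        log_det = torch.log(torch.abs(1 + psi @ self.u) + 1e-8)  # (batch,)
        return f_z, log_det
```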

NOTE: Currently I provide an implementation only for the simple case, where the true density can be expressed in closed form, so it is possible to explicitly minimize the KL divergence between the true density and the density represented by the normalizing flow. Implementing the most general case of a normalizing flow, one capable of learning from raw data, is problematic for the transformation described in the paper, since its inverse cannot be expressed in closed form. I'm currently working on another kind of normalizing flow, called Glow, in which all transformations have closed-form inverses, and I plan to release it soon. Stay tuned!
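For the simple case described above, the training objective can be a Monte Carlo estimate of the KL divergence between the flow density and the known target. The sketch below assumes the PlanarFlow layer from the previous snippet, a torch.distributions base distribution, and a user-supplied log_p_target callable; all names are illustrative rather than taken from this repository.

```python
import torch

def flow_kl_loss(flows, base_dist, log_p_target, num_samples=256):
    """Monte Carlo estimate of KL(q_K || p), up to the constant log Z of p.

    flows:        sequence of invertible layers returning (z_new, log_det)
    base_dist:    torch.distributions object for q_0 (e.g. a standard Normal)
    log_p_target: callable giving the log of the (unnormalized) true density
    """
    z = base_dist.sample((num_samples,))
    log_q = base_dist.log_prob(z)
    if log_q.dim() > 1:            # sum per-dimension log-probs if needed
        log_q = log_q.sum(-1)
    for flow in flows:
        z, log_det = flow(z)       # change of variables:
        log_q = log_q - log_det    # log q_k(z_k) = log q_{k-1}(z_{k-1}) - log|det J_k|
    return (log_q - log_p_target(z)).mean()
```

Minimizing this loss with any standard optimizer drives the flow density toward the target; since the base distribution here is fixed, gradients propagate only into the flow parameters.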

I got the following results:

As can be seen, the approximation quality indeed improves as the flow length increases.

Reproducing my results

To reproduce my results, you will need to install PyTorch.

Then install the remaining dependencies from requirements.txt. If you are using pip, simply run pip install -r requirements.txt.

After you have installed the dependencies, run python run_experiment.py and collect the results in the experiments folder.
