
abdulfatir / Normalizing Flows

License: MIT
Understanding normalizing flows

Projects that are alternatives of or similar to Normalizing Flows

Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+119.05%)
Mutual labels:  jupyter-notebook, variational-autoencoder, variational-inference
Boltzmann Machines
Boltzmann Machines in TensorFlow with examples
Stars: ✭ 768 (+509.52%)
Mutual labels:  jupyter-notebook, variational-inference
Probabilistic unet
A U-Net combined with a variational auto-encoder that is able to learn conditional distributions over semantic segmentations.
Stars: ✭ 427 (+238.89%)
Mutual labels:  jupyter-notebook, variational-inference
Variational gradient matching for dynamical systems
Sample code for the NIPS paper "Scalable Variational Inference for Dynamical Systems"
Stars: ✭ 22 (-82.54%)
Mutual labels:  jupyter-notebook, variational-inference
lagvae
Lagrangian VAE
Stars: ✭ 27 (-78.57%)
Mutual labels:  variational-inference, variational-autoencoder
Vae cf
Variational autoencoders for collaborative filtering
Stars: ✭ 386 (+206.35%)
Mutual labels:  jupyter-notebook, variational-autoencoder
Bayesian Neural Networks
Pytorch implementations of Bayes By Backprop, MC Dropout, SGLD, the Local Reparametrization Trick, KF-Laplace, SG-HMC and more
Stars: ✭ 900 (+614.29%)
Mutual labels:  jupyter-notebook, variational-inference
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-85.71%)
Mutual labels:  variational-inference, variational-autoencoder
Deepbayes 2018
Seminars DeepBayes Summer School 2018
Stars: ✭ 1,021 (+710.32%)
Mutual labels:  jupyter-notebook, variational-inference
Vae protein function
Protein function prediction using a variational autoencoder
Stars: ✭ 57 (-54.76%)
Mutual labels:  jupyter-notebook, variational-autoencoder
Bayesian Machine Learning
Notebooks about Bayesian methods for machine learning
Stars: ✭ 1,202 (+853.97%)
Mutual labels:  jupyter-notebook, variational-autoencoder
CIKM18-LCVA
Code for CIKM'18 paper, Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects.
Stars: ✭ 13 (-89.68%)
Mutual labels:  variational-inference, variational-autoencoder
Vae Tensorflow
A Tensorflow implementation of a Variational Autoencoder for the deep learning course at the University of Southern California (USC).
Stars: ✭ 117 (-7.14%)
Mutual labels:  jupyter-notebook, variational-autoencoder
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+231.75%)
Mutual labels:  variational-autoencoder, variational-inference
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-84.13%)
Mutual labels:  variational-inference, variational-autoencoder
Variational Autoencoder
Variational autoencoder implemented in tensorflow and pytorch (including inverse autoregressive flow)
Stars: ✭ 807 (+540.48%)
Mutual labels:  variational-autoencoder, variational-inference
SIVI
Using neural network to build expressive hierarchical distribution; A variational method to accurately estimate posterior uncertainty; A fast and general method for Bayesian inference. (ICML 2018)
Stars: ✭ 49 (-61.11%)
Mutual labels:  variational-inference, variational-autoencoder
normalizing-flows
PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+115.08%)
Mutual labels:  variational-inference, variational-autoencoder
Variational Autoencoder
PyTorch implementation of "Auto-Encoding Variational Bayes"
Stars: ✭ 25 (-80.16%)
Mutual labels:  jupyter-notebook, variational-autoencoder
Kvae
Kalman Variational Auto-Encoder
Stars: ✭ 115 (-8.73%)
Mutual labels:  variational-autoencoder, variational-inference

Normalizing Flows

Note: There are some bugs in the implementation of VAE+PF. For an updated PyTorch implementation, please check: abdulfatir/planar-flow-pytorch.

Accompanying documentation

  1. Normalizing Flows: Planar and Radial Flows
  2. Technical Report

Results

Variational Inference with Normalizing Flows (Rezende and Mohamed)

The function used by the planar flow does not have an analytic inverse, which makes it unsuitable for direct likelihood estimation from data. It can still work well in VAEs, because inversion is not required there. However, when an analytic target density is available, the KL divergence can be minimized explicitly (excluding constant terms). Below are the results for two complex 2-D densities similar to the ones in the paper. The second column shows samples obtained using MCMC (see notebooks/Metropolis-Hastings.ipynb). The third and fourth columns show results using planar flows of different lengths (see notebooks/PlanarFlow-Example1.ipynb and notebooks/PlanarFlow-Example2.ipynb). During the experiments, I found that minimizing the KL divergence is not always stable and does not always reach a good solution (especially for density 2).
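The MCMC baseline mentioned above can be sketched as a simple random-walk Metropolis-Hastings sampler. This is a minimal NumPy sketch, not the notebook's code; the ring-shaped `log_p` is a hypothetical stand-in for the paper-style 2-D densities:

```python
import numpy as np

def metropolis_hastings(log_p, n_samples, step=0.5, init=None, seed=0):
    """Random-walk Metropolis-Hastings targeting an unnormalized log density log_p."""
    rng = np.random.default_rng(seed)
    z = np.zeros(2) if init is None else np.asarray(init, dtype=float)
    samples = []
    for _ in range(n_samples):
        proposal = z + step * rng.standard_normal(z.shape)
        # Accept with probability min(1, p(proposal)/p(z)); the normalizing
        # constant of p cancels, so log_p may be unnormalized.
        if np.log(rng.uniform()) < log_p(proposal) - log_p(z):
            z = proposal
        samples.append(z.copy())
    return np.array(samples)

# Hypothetical ring-shaped target (illustrative, not one of the repo's densities).
log_p = lambda z: -0.5 * ((np.linalg.norm(z) - 2.0) / 0.4) ** 2
samples = metropolis_hastings(log_p, 5000)
```

Discarding an initial burn-in portion of `samples` before plotting gives draws concentrated on the ring.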

Columns, left to right: true density; samples using Metropolis-Hastings; samples using planar flow (K = 4); samples using planar flow (K = 16).
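The planar transform and the explicit KL objective can be sketched as follows. This is a NumPy sketch under assumed names (`planar_flow`, `kl_estimate`); the repository's actual PyTorch implementation lives in the notebooks:

```python
import numpy as np

def planar_flow(z, u, w, b):
    """f(z) = z + u_hat * tanh(w.z + b), plus log|det df/dz| per sample.
    u is reparametrized so that w.u_hat >= -1, keeping f invertible."""
    wu = w @ u
    m = -1.0 + np.log1p(np.exp(wu))            # softplus(wu) - 1
    u_hat = u + (m - wu) * w / (w @ w)
    a = z @ w + b                              # (batch,)
    f = z + np.outer(np.tanh(a), u_hat)        # (batch, dim)
    # log|det df/dz| = log|1 + (u_hat . w) * (1 - tanh^2(a))|
    log_det = np.log(np.abs(1 + (u_hat @ w) * (1 - np.tanh(a) ** 2)) + 1e-8)
    return f, log_det

def kl_estimate(params, log_p, n=1000, seed=0):
    """Monte Carlo estimate of KL(q_K || p), up to p's constant log-partition."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal((n, 2))
    log_q = -np.log(2 * np.pi) - 0.5 * (z ** 2).sum(axis=1)  # 2-D standard normal base
    for u, w, b in params:
        z, log_det = planar_flow(z, u, w, b)
        log_q -= log_det                       # change of variables
    return np.mean(log_q - log_p(z))
```

In training, `kl_estimate` would be minimized over the flow parameters with an autograd framework; this sketch only shows the forward pass and the objective.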

VAE with Planar Flow

[REMOVED]


Note

Originally, this repository contained notes and code on normalizing flows, which we wrote as part of a course project (CS6202 @ NUS). Some ideas are borrowed from this repo.
