
ermongroup / lagvae

License: MIT
Lagrangian VAE

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to lagvae

Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto-Encoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+922.22%)
Mutual labels:  generative-adversarial-network, variational-inference, variational-autoencoder
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with TensorFlow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (+396.3%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-25.93%)
Mutual labels:  variational-inference, variational-autoencoder
CIKM18-LCVA
Code for CIKM'18 paper, Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects.
Stars: ✭ 13 (-51.85%)
Mutual labels:  variational-inference, variational-autoencoder
Deep Generative Models
Deep generative models implemented with TensorFlow 2.0, e.g. Restricted Boltzmann Machine (RBM), Deep Belief Network (DBN), Deep Boltzmann Machine (DBM), Convolutional Variational Auto-Encoder (CVAE), Convolutional Generative Adversarial Network (CGAN)
Stars: ✭ 34 (+25.93%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Repo 2017
Python code for Machine Learning, NLP, Deep Learning and Reinforcement Learning with Keras and Theano
Stars: ✭ 1,123 (+4059.26%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Video prediction
Stochastic Adversarial Video Prediction
Stars: ✭ 247 (+814.81%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
tt-vae-gan
Timbre transfer with variational autoencoding and cycle-consistent adversarial networks. Able to transfer the timbre of one audio source to another.
Stars: ✭ 37 (+37.04%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
normalizing-flows
PyTorch implementation of normalizing flow models
Stars: ✭ 271 (+903.7%)
Mutual labels:  variational-inference, variational-autoencoder
precision-recall-distributions
Assessing Generative Models via Precision and Recall (official repository)
Stars: ✭ 80 (+196.3%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
deep-blueberry
If you've always wanted to learn about deep learning but don't know where to start, then you might have stumbled upon the right place!
Stars: ✭ 17 (-37.04%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Pytorch Rl
This repository contains model-free deep reinforcement learning algorithms implemented in PyTorch
Stars: ✭ 394 (+1359.26%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
vaegan
An implementation of VAEGAN (variational autoencoder + generative adversarial network).
Stars: ✭ 88 (+225.93%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Rectorch
rectorch is a PyTorch-based framework for state-of-the-art top-N recommendation
Stars: ✭ 121 (+348.15%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Textbox
TextBox is an open-source library for building text generation systems.
Stars: ✭ 257 (+851.85%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Focal Frequency Loss
Focal Frequency Loss for Generative Models
Stars: ✭ 141 (+422.22%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Normalizing Flows
Understanding normalizing flows
Stars: ✭ 126 (+366.67%)
Mutual labels:  variational-inference, variational-autoencoder
Tensorflow Mnist Cvae
TensorFlow implementation of a conditional variational auto-encoder for MNIST
Stars: ✭ 139 (+414.81%)
Mutual labels:  variational-inference, variational-autoencoder
SIVI
Using neural networks to build expressive hierarchical distributions; a variational method to accurately estimate posterior uncertainty; a fast and general method for Bayesian inference. (ICML 2018)
Stars: ✭ 49 (+81.48%)
Mutual labels:  variational-inference, variational-autoencoder
Generative Continual Learning
No description or website provided.
Stars: ✭ 51 (+88.89%)
Mutual labels:  generative-adversarial-network, variational-autoencoder

Lagrangian VAE

TensorFlow implementation for the paper A Lagrangian Perspective on Latent Variable Generative Models, UAI 2018 Oral.

Shengjia Zhao, Jiaming Song and Stefano Ermon, Stanford Artificial Intelligence Laboratory

Overview

In this paper, we generalize the objective of latent variable generative models to two targets:

  • Primal Problem: "mutual information objectives", such as maximizing / minimizing mutual information between observations and latent variables.
  • Constraints: "consistency", which ensures that the model posterior is close to the amortized posterior.

Lagrangian VAE provides a practical way to find the best trade-off between "consistency constraints" and "mutual information objectives", as opposed to performing extensive hyperparameter tuning. We demonstrate this on InfoVAE, a latent variable generative model objective that otherwise requires tuning the strengths of the corresponding hyperparameters.
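
The following is a minimal schematic sketch (not the repository's actual code) of how such a constrained objective becomes a min-max problem with learnable Lagrange multipliers. It assumes TensorFlow 1.x; the tensors mi_estimate, elbo_gap, and mmd are hypothetical stand-ins for the mutual information objective and the two consistency measures, and mi_weight, e1, e2 correspond loosely to the --mi, --e1, --e2 flags used in the examples below.

    import tensorflow as tf

    def lagvae_losses(mi_estimate, elbo_gap, mmd, e1, e2, mi_weight=1.0):
        # Keep the Lagrange multipliers positive via a softplus parameterization.
        lam = tf.nn.softplus(tf.get_variable(
            "lambdas", shape=[2], initializer=tf.ones_initializer()))
        # Weighted constraint violations: positive when a constraint is broken.
        violation = lam[0] * (elbo_gap - e1) + lam[1] * (mmd - e2)
        # Model parameters minimize the Lagrangian ...
        model_loss = -mi_weight * mi_estimate + violation
        # ... while the multipliers maximize it (implemented as descent on the negation).
        multiplier_loss = -violation
        return model_loss, multiplier_loss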

As demonstrated in the results figure of the paper, LagVAE manages to find a near Pareto-optimal curve for the trade-off between mutual information and consistency.

Requirements

  • click
  • gputil
  • tqdm

Files

  • methods/infovae.py: InfoVAE implementation (Lagrange multipliers are fixed hyperparameters, not optimized)
  • methods/lagvae.py: LagVAE implementation (Lagrange multipliers are optimized; see the sketch below)
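
Schematically, the difference between the two files is which variables are trained. Continuing the hypothetical sketch from the overview (model_vars and multiplier_vars are assumed variable lists, not names from the repository):

    import tensorflow as tf

    # InfoVAE: the multipliers are fixed hyperparameters; only the encoder and
    # decoder weights are updated.
    # LagVAE: a second optimizer additionally performs gradient ascent on the
    # multipliers by descending the negated violation term.
    model_op = tf.train.AdamOptimizer(1e-3).minimize(model_loss, var_list=model_vars)
    multiplier_op = tf.train.AdamOptimizer(1e-3).minimize(multiplier_loss, var_list=multiplier_vars)
    train_op = tf.group(model_op, multiplier_op)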

Examples

Please set the environment variables EXP_LOG_PATH and DATA_PREFIX, used for logging experiments and for downloading data, prior to running the examples.
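For example (both paths are placeholders; any writable directories will do):

    export EXP_LOG_PATH=/tmp/lagvae/log
    export DATA_PREFIX=/tmp/lagvae/data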

  • InfoVAE: python examples/infovae.py --mi=1.0 --e1=1.0 --e2=1.0
  • LagVAE: python examples/lagvae.py --mi=1.0 --e1=86.0 --e2=5.0

Note that we scale up MMD by 10000 in the implementation, so --e2=5.0 for LagVAE corresponds to the constraint MMD < 5.0 / 10000 = 0.0005.

Feel free to play around with different VariationalEncoder, VariationalDecoder, optimizers, and datasets.
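
As a loose illustration only: the actual VariationalEncoder and VariationalDecoder interfaces are defined in the repository, and the function below is a hypothetical stand-in (TensorFlow 1.x assumed) for the kind of Gaussian encoder one might swap in.

    import tensorflow as tf

    def gaussian_encoder(x, latent_dim=10):
        # Hypothetical custom encoder: map inputs to the mean and log-variance
        # of a diagonal Gaussian q(z|x).
        h = tf.layers.dense(x, 512, activation=tf.nn.relu)
        h = tf.layers.dense(h, 512, activation=tf.nn.relu)
        mean = tf.layers.dense(h, latent_dim)
        logvar = tf.layers.dense(h, latent_dim)
        # Reparameterization trick: z = mean + sigma * eps, eps ~ N(0, I).
        z = mean + tf.exp(0.5 * logvar) * tf.random_normal(tf.shape(mean))
        return z, mean, logvar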

References

If you find the idea or code useful for your research, please consider citing our paper:

@article{zhao2018the,
  title={The Information Autoencoding Family: A Lagrangian Perspective on Latent Variable Generative Models},
  author={Zhao, Shengjia and Song, Jiaming and Ermon, Stefano},
  journal={arXiv preprint arXiv:1806.06514},
  year={2018}
}

Acknowledgements

utils/logger.py is based on an implementation in OpenAI Baselines.

Contact

tsong [at] cs [dot] stanford [dot] edu
