EmilienDupont / vae-concrete

Licence: other
Keras implementation of a Variational Auto Encoder with a Concrete Latent Distribution

Programming Languages

python

Projects that are alternatives of or similar to vae-concrete

VAE-Gumbel-Softmax
An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick in TensorFlow (tested on r1.5, CPU and GPU), from ICLR 2017.
Stars: ✭ 66 (+29.41%)
Mutual labels:  vae, variational-autoencoder, gumbel-softmax
Smrt
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (+100%)
Mutual labels:  vae, variational-autoencoder
soft-intro-vae-pytorch
[CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+233.33%)
Mutual labels:  vae, variational-autoencoder
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (+162.75%)
Mutual labels:  vae, variational-autoencoder
Variational Recurrent Autoencoder Tensorflow
A tensorflow implementation of "Generating Sentences from a Continuous Space"
Stars: ✭ 228 (+347.06%)
Mutual labels:  vae, variational-autoencoder
Vae For Image Generation
Implemented Variational Autoencoder generative model in Keras for image generation and its latent space visualization on MNIST and CIFAR10 datasets
Stars: ✭ 87 (+70.59%)
Mutual labels:  vae, variational-autoencoder
Vae Tensorflow
A Tensorflow implementation of a Variational Autoencoder for the deep learning course at the University of Southern California (USC).
Stars: ✭ 117 (+129.41%)
Mutual labels:  vae, variational-autoencoder
Variational Autoencoder
PyTorch implementation of "Auto-Encoding Variational Bayes"
Stars: ✭ 25 (-50.98%)
Mutual labels:  vae, variational-autoencoder
Pytorch Vae
A Collection of Variational Autoencoders (VAE) in PyTorch.
Stars: ✭ 2,704 (+5201.96%)
Mutual labels:  vae, gumbel-softmax
MIDI-VAE
No description or website provided.
Stars: ✭ 56 (+9.8%)
Mutual labels:  vae, variational-autoencoder
Pytorch Vae
A CNN Variational Autoencoder (CNN-VAE) implemented in PyTorch
Stars: ✭ 181 (+254.9%)
Mutual labels:  vae, variational-autoencoder
Vae Cvae Mnist
Variational Autoencoder and Conditional Variational Autoencoder on MNIST in PyTorch
Stars: ✭ 229 (+349.02%)
Mutual labels:  vae, variational-autoencoder
benchmark VAE
Unifying Variational Autoencoder (VAE) implementations in Pytorch (NeurIPS 2022)
Stars: ✭ 1,211 (+2274.51%)
Mutual labels:  vae, variational-autoencoder
Python World
Stars: ✭ 98 (+92.16%)
Mutual labels:  vae, variational-autoencoder
Vae protein function
Protein function prediction using a variational autoencoder
Stars: ✭ 57 (+11.76%)
Mutual labels:  vae, variational-autoencoder
Mojitalk
Code for "MojiTalk: Generating Emotional Responses at Scale" https://arxiv.org/abs/1711.04090
Stars: ✭ 107 (+109.8%)
Mutual labels:  vae, variational-autoencoder
Video prediction
Stochastic Adversarial Video Prediction
Stars: ✭ 247 (+384.31%)
Mutual labels:  vae, variational-autoencoder
Tensorflow Mnist Vae
Tensorflow implementation of variational auto-encoder for MNIST
Stars: ✭ 422 (+727.45%)
Mutual labels:  vae, variational-autoencoder
Variational Autoencoder
Variational autoencoder implemented in tensorflow and pytorch (including inverse autoregressive flow)
Stars: ✭ 807 (+1482.35%)
Mutual labels:  vae, variational-autoencoder
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (+172.55%)
Mutual labels:  vae, variational-autoencoder

Variational Auto Encoder with Concrete Latent Distribution

Keras implementation of a Variational Auto Encoder with a Concrete latent distribution. See "Auto-Encoding Variational Bayes" by Kingma and Welling, and "The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables" by Maddison, Mnih and Teh or "Categorical Reparameterization with Gumbel-Softmax" by Jang, Gu and Poole.
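The core idea from the papers above is that a sample from a Concrete (Gumbel-Softmax) distribution is a differentiable relaxation of a one-hot categorical sample: add Gumbel noise to the class logits and take a tempered softmax. A minimal NumPy sketch of that sampling step (this is illustrative, not the repository's code):

```python
import numpy as np

def sample_concrete(logits, temperature=0.67, rng=None):
    """Draw one relaxed one-hot sample from a Concrete/Gumbel-Softmax
    distribution over the categories defined by `logits`."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via the inverse-CDF trick
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    # Tempered softmax of the perturbed logits;
    # lower temperature -> sample closer to a hard one-hot vector
    y = (logits + gumbel) / temperature
    y = np.exp(y - y.max())
    return y / y.sum()

sample = sample_concrete(np.log(np.array([0.1, 0.2, 0.7])))
```

The result lies on the probability simplex, so gradients flow through the sampling step during training, unlike a hard categorical draw.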

Examples

Samples from a regular VAE

VAE with concrete latent distribution. Each column of the image corresponds to one of the categories of the latent concrete distribution.
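A grid like that can be generated by fixing the discrete code to each one-hot category per column while varying the continuous code per row, then decoding each latent vector. A sketch of building those latent codes (the repository's plotting code may differ; dimensions match the 2-dim continuous, 10-category example below):

```python
import numpy as np

num_categories, rows = 10, 8
grid = []
for k in range(num_categories):
    one_hot = np.eye(num_categories)[k]      # fix the discrete code to category k
    for r in range(rows):
        z_cont = np.random.randn(2)          # vary the continuous code
        grid.append(np.concatenate([z_cont, one_hot]))
grid = np.stack(grid)                        # (80, 12): feed each row to the decoder
```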

Usage

Traditional VAE with a 2-dimensional latent distribution

>>> from vae_concrete import VAE
>>> model = VAE(latent_cont_dim=2)
>>> model.fit(x_train, num_epochs=20)
>>> model.plot()

You should start seeing good results after ~5 epochs. The loss should approach ~140 upon convergence. Occasionally the optimization gets stuck in a poor local minimum and stays around ~205; in that case it is best to restart the optimization.

VAE with 2 continuous variables and a 10-dimensional discrete distribution

>>> model = VAE(latent_cont_dim=2, latent_disc_dim=10)
>>> model.fit(x_train, num_epochs=10)
>>> model.plot()

It takes ~10 epochs to start seeing good results. The loss should go down to ~125.
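With both latent types, the ELBO carries two KL terms: the usual Gaussian KL for the continuous code, and a categorical KL against a uniform prior for the discrete code (computed on the softmax probabilities, as in the Gumbel-Softmax paper). A NumPy sketch of those two terms, assuming a standard-normal and a uniform prior (not the repository's exact loss code):

```python
import numpy as np

def kl_gaussian(mu, log_var):
    """KL(N(mu, sigma^2) || N(0, 1)), summed over latent dimensions."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))

def kl_categorical(logits):
    """KL(q(c) || Uniform(K)) for the discrete latent, via softmax probs."""
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    k = probs.size
    return np.sum(probs * (np.log(probs + 1e-10) - np.log(1.0 / k)))

# Both terms vanish when the posterior matches the prior:
kl_c = kl_gaussian(np.zeros(2), np.zeros(2))   # standard-normal posterior
kl_d = kl_categorical(np.zeros(10))            # uniform posterior over 10 classes
```

The categorical KL is bounded above by log K, so with 10 categories the discrete code can contribute at most ~2.3 nats to the loss.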

Dependencies

  • keras
  • tensorflow (only tested with the TensorFlow backend)
  • plotly

Acknowledgements

Code was inspired by the Keras VAE implementation (plotting functionality was also borrowed and modified from this example).