
altosaar / Variational Autoencoder

License: MIT
Variational autoencoder implemented in TensorFlow and PyTorch (including inverse autoregressive flow)

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to Variational Autoencoder

Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (-48.2%)
Mutual labels:  unsupervised-learning, vae, variational-autoencoder, variational-inference
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-93.06%)
Mutual labels:  vae, unsupervised-learning, variational-autoencoder
ladder-vae-pytorch
Ladder Variational Autoencoders (LVAE) in PyTorch
Stars: ✭ 59 (-92.69%)
Mutual labels:  vae, unsupervised-learning, variational-inference
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (-82.78%)
Mutual labels:  vae, variational-autoencoder, variational-inference
Disentangling Vae
Experiments for understanding disentanglement in VAE latent representations
Stars: ✭ 398 (-50.68%)
Mutual labels:  unsupervised-learning, vae, variational-autoencoder
Ludwig
Data-centric declarative deep learning framework
Stars: ✭ 8,018 (+893.56%)
Mutual labels:  learning, deep-neural-networks, deep
classifying-vae-lstm
music generation with a classifying variational autoencoder (VAE) and LSTM
Stars: ✭ 27 (-96.65%)
Mutual labels:  vae, variational-autoencoder
S Vae Pytorch
Pytorch implementation of Hyperspherical Variational Auto-Encoders
Stars: ✭ 255 (-68.4%)
Mutual labels:  vae, variational-autoencoder
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto-Encoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (-65.8%)
Mutual labels:  variational-autoencoder, variational-inference
Action Recognition Visual Attention
Action recognition using soft attention based deep recurrent neural networks
Stars: ✭ 350 (-56.63%)
Mutual labels:  deep-neural-networks, deep
CIKM18-LCVA
Code for CIKM'18 paper, Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects.
Stars: ✭ 13 (-98.39%)
Mutual labels:  variational-inference, variational-autoencoder
Jeelizar
JavaScript object detection lightweight library for augmented reality (WebXR demos included). It uses convolutional neural networks running on the GPU with WebGL.
Stars: ✭ 296 (-63.32%)
Mutual labels:  learning, deep
Awesome Cybersecurity Datasets
A curated list of amazingly awesome Cybersecurity datasets
Stars: ✭ 380 (-52.91%)
Mutual labels:  learning, deep
Pytorch Rl
This repository contains model-free deep reinforcement learning algorithms implemented in Pytorch
Stars: ✭ 394 (-51.18%)
Mutual labels:  vae, variational-autoencoder
L2c
Learning to Cluster. A deep clustering strategy.
Stars: ✭ 262 (-67.53%)
Mutual labels:  deep-neural-networks, unsupervised-learning
lagvae
Lagrangian VAE
Stars: ✭ 27 (-96.65%)
Mutual labels:  variational-inference, variational-autoencoder
Beta Vae
Pytorch implementation of β-VAE
Stars: ✭ 326 (-59.6%)
Mutual labels:  unsupervised-learning, vae
Amazon Sagemaker Examples
Example 📓 Jupyter notebooks that demonstrate how to build, train, and deploy machine learning models using 🧠 Amazon SageMaker.
Stars: ✭ 6,346 (+686.37%)
Mutual labels:  learning, deep
sqair
Implementation of Sequential Attend, Infer, Repeat (SQAIR)
Stars: ✭ 96 (-88.1%)
Mutual labels:  vae, variational-inference
videoMultiGAN
End to End learning for Video Generation from Text
Stars: ✭ 53 (-93.43%)
Mutual labels:  learning, deep

Variational Autoencoder in TensorFlow and PyTorch

DOI

Reference implementation for a variational autoencoder in TensorFlow and PyTorch.

I recommend the PyTorch version. It includes an example of a more expressive variational family, the inverse autoregressive flow.

Variational inference is used to fit the model to binarized MNIST handwritten digit images. An inference network (encoder) amortizes inference and shares parameters across datapoints. The likelihood is parameterized by a generative network (decoder).
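
The encoder/decoder setup above can be sketched framework-agnostically. This is a minimal NumPy illustration (not the repository's code; the linear "networks", weight names, and dimensions are hypothetical stand-ins for the real neural networks) of a mean-field Gaussian q(z|x), the reparameterization trick, and a single-sample ELBO estimate with a Bernoulli likelihood:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny dimensions for illustration; the repo's actual sizes differ.
x_dim, z_dim = 784, 2

def encoder(x, W_mu, W_ls):
    """Amortized inference network: maps x to mean-field parameters of q(z|x)."""
    return x @ W_mu, x @ W_ls  # mu, log_sigma

def decoder(z, W_dec):
    """Generative network: maps z to Bernoulli logits over pixels."""
    return z @ W_dec

def elbo(x, W_mu, W_ls, W_dec):
    """Single-sample ELBO estimate per datapoint: E_q[log p(x|z)] - KL(q || p)."""
    mu, log_sigma = encoder(x, W_mu, W_ls)
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(log_sigma) * eps          # reparameterization trick
    logits = decoder(z, W_dec)
    # Bernoulli log-likelihood log p(x|z), numerically stable via logaddexp
    log_lik = np.sum(x * logits - np.logaddexp(0.0, logits), axis=-1)
    # Analytic KL(N(mu, sigma^2) || N(0, I)) for a diagonal Gaussian
    kl = 0.5 * np.sum(np.exp(2 * log_sigma) + mu**2 - 1.0 - 2 * log_sigma, axis=-1)
    return log_lik - kl

# Toy "binarized" batch with random weights, just to show the shapes flow through.
W_mu = rng.standard_normal((x_dim, z_dim)) * 0.01
W_ls = rng.standard_normal((x_dim, z_dim)) * 0.01
W_dec = rng.standard_normal((z_dim, x_dim)) * 0.01
x = (rng.random((8, x_dim)) < 0.5).astype(float)
print(elbo(x, W_mu, W_ls, W_dec).shape)  # one ELBO estimate per datapoint
```

Because sampling goes through the deterministic map mu + sigma * eps, gradients can flow to the encoder parameters; that is the point of the reparameterization trick.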

Blog post: https://jaan.io/what-is-variational-autoencoder-vae-tutorial/

Example output with importance sampling for estimating the marginal likelihood on Hugo Larochelle's Binary MNIST dataset. Final marginal likelihood on the test set: -97.10 nats.

$ python train_variational_autoencoder_pytorch.py --variational mean-field
step:   0       train elbo: -558.69
step:   0               valid elbo: -391.84     valid log p(x): -363.25
step:   5000    train elbo: -116.09
step:   5000            valid elbo: -112.57     valid log p(x): -107.01
step:   10000   train elbo: -105.82
step:   10000           valid elbo: -108.49     valid log p(x): -102.62
step:   15000   train elbo: -106.78
step:   15000           valid elbo: -106.97     valid log p(x): -100.97
step:   20000   train elbo: -108.43
step:   20000           valid elbo: -106.23     valid log p(x): -100.04
step:   25000   train elbo: -99.68
step:   25000           valid elbo: -104.89     valid log p(x): -98.83
step:   30000   train elbo: -96.71
step:   30000           valid elbo: -104.50     valid log p(x): -98.34
step:   35000   train elbo: -98.64
step:   35000           valid elbo: -104.05     valid log p(x): -97.87
step:   40000   train elbo: -93.60
step:   40000           valid elbo: -104.10     valid log p(x): -97.68
step:   45000   train elbo: -96.45
step:   45000           valid elbo: -104.58     valid log p(x): -97.76
step:   50000   train elbo: -101.63
step:   50000           valid elbo: -104.72     valid log p(x): -97.81
step:   55000   train elbo: -106.78
step:   55000           valid elbo: -105.14     valid log p(x): -98.06
step:   60000   train elbo: -100.58
step:   60000           valid elbo: -104.13     valid log p(x): -97.30
step:   65000   train elbo: -96.19
step:   65000           valid elbo: -104.46     valid log p(x): -97.43
step:   65000           test elbo: -103.31      test log p(x): -97.10
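
The importance-sampling estimate of log p(x) reported above can be sketched on a toy model where the answer is known in closed form. This is an illustrative NumPy sketch, not the repository's code: here the proposal is a hand-picked Gaussian, whereas in the VAE the trained inference network q(z|x) plays that role.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_normal(x, mean, var):
    """Log density of a univariate Gaussian N(mean, var)."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

# Toy model with a closed-form marginal:
# p(z) = N(0, 1), p(x|z) = N(z, 1)  =>  p(x) = N(0, 2).
x = 1.3
K = 100_000

# Proposal q(z|x) = N(x/2, 1): deliberately not the exact posterior,
# mimicking an approximate inference network.
z = rng.normal(x / 2, 1.0, size=K)

# log importance weights: log p(z) + log p(x|z) - log q(z|x)
log_w = log_normal(z, 0.0, 1.0) + log_normal(x, z, 1.0) - log_normal(z, x / 2, 1.0)

# log p(x) ~= log (1/K) sum_k w_k, computed stably via the log-sum-exp trick
m = log_w.max()
log_px_hat = m + np.log(np.mean(np.exp(log_w - m)))
print(log_px_hat, log_normal(x, 0.0, 2.0))  # estimate vs. exact
```

The same estimator, with the encoder as proposal and the decoder supplying log p(x|z), produces the "valid log p(x)" and "test log p(x)" columns in the logs above.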

Using a more expressive, non-mean-field variational posterior approximation (inverse autoregressive flow, https://arxiv.org/abs/1606.04934), the test marginal log-likelihood improves to -95.33 nats:

$ python train_variational_autoencoder_pytorch.py --variational flow
step:   0       train elbo: -578.35
step:   0               valid elbo: -407.06     valid log p(x): -367.88
step:   10000   train elbo: -106.63
step:   10000           valid elbo: -110.12     valid log p(x): -104.00
step:   20000   train elbo: -101.51
step:   20000           valid elbo: -105.02     valid log p(x): -99.11
step:   30000   train elbo: -98.70
step:   30000           valid elbo: -103.76     valid log p(x): -97.71
step:   40000   train elbo: -104.31
step:   40000           valid elbo: -103.71     valid log p(x): -97.27
step:   50000   train elbo: -97.20
step:   50000           valid elbo: -102.97     valid log p(x): -96.60
step:   60000   train elbo: -97.50
step:   60000           valid elbo: -102.82     valid log p(x): -96.49
step:   70000   train elbo: -94.68
step:   70000           valid elbo: -102.63     valid log p(x): -96.22
step:   80000   train elbo: -92.86
step:   80000           valid elbo: -102.53     valid log p(x): -96.09
step:   90000   train elbo: -93.83
step:   90000           valid elbo: -102.33     valid log p(x): -96.00
step:   100000  train elbo: -93.91
step:   100000          valid elbo: -102.48     valid log p(x): -95.92
step:   110000  train elbo: -94.34
step:   110000          valid elbo: -102.81     valid log p(x): -96.09
step:   120000  train elbo: -88.63
step:   120000          valid elbo: -102.53     valid log p(x): -95.80
step:   130000  train elbo: -96.61
step:   130000          valid elbo: -103.56     valid log p(x): -96.26
step:   140000  train elbo: -94.92
step:   140000          valid elbo: -102.81     valid log p(x): -95.86
step:   150000  train elbo: -97.84
step:   150000          valid elbo: -103.06     valid log p(x): -95.92
step:   150000          test elbo: -101.64      test log p(x): -95.33
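
A single inverse autoregressive flow step can be sketched as follows. This is a simplified NumPy illustration (not the repository's implementation): the masked linear layers stand in for the MADE-style autoregressive networks, and the strictly lower-triangular masks are what make the Jacobian triangular so its log-determinant is cheap to compute.

```python
import numpy as np

rng = np.random.default_rng(0)
z_dim = 4

# Hypothetical masked weights: strictly lower-triangular so that (mu_i, s_i)
# depend only on z_{<i}, making the transform autoregressive.
mask = np.tril(np.ones((z_dim, z_dim)), k=-1)
W_mu = rng.standard_normal((z_dim, z_dim)) * mask
W_s = rng.standard_normal((z_dim, z_dim)) * mask

def iaf_step(z):
    """One IAF step, gated form from Kingma et al. 2016:
    z' = sigmoid(s) * z + (1 - sigmoid(s)) * mu."""
    mu, s = z @ W_mu.T, z @ W_s.T
    gate = 1.0 / (1.0 + np.exp(-s))           # sigmoid gating
    z_new = gate * z + (1.0 - gate) * mu
    log_det = np.sum(np.log(gate), axis=-1)   # log |det dz'/dz| (triangular Jacobian)
    return z_new, log_det

z0 = rng.standard_normal((3, z_dim))          # samples from the base q(z|x)
z1, log_det = iaf_step(z0)
# Change of variables: log q(z1 | x) = log q(z0 | x) - log_det
```

Stacking several such steps (permuting dimensions between them) yields the richer posterior family that closes part of the gap between the ELBO and log p(x) in the run above.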