Variational Auto-Encoder for MNIST

An implementation of the variational auto-encoder (VAE) for MNIST described in the paper:

  • Auto-Encoding Variational Bayes by Kingma and Welling (https://arxiv.org/abs/1312.6114)
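
For intuition, a VAE is trained by minimizing the negative ELBO: a reconstruction term plus a KL regularizer that pulls the encoder's distribution toward the prior. Below is a minimal NumPy sketch of that objective, assuming a Gaussian encoder and a Bernoulli decoder; the function names are illustrative, not this repository's API.

import numpy as np

def kl_divergence(mu, log_var):
    # Closed-form KL between the encoder's Gaussian N(mu, sigma^2) and the
    # standard normal prior, summed over latent dimensions (paper, Appendix B).
    return -0.5 * np.sum(1.0 + log_var - mu ** 2 - np.exp(log_var), axis=-1)

def bernoulli_nll(x, x_recon, eps=1e-8):
    # Per-pixel cross-entropy of the input under a Bernoulli decoder,
    # summed over pixels; x and x_recon hold values in [0, 1].
    return -np.sum(x * np.log(x_recon + eps)
                   + (1.0 - x) * np.log(1.0 - x_recon + eps), axis=-1)

def negative_elbo(x, x_recon, mu, log_var):
    # Training minimizes reconstruction loss plus the KL regularizer,
    # averaged over the batch.
    return np.mean(bernoulli_nll(x, x_recon) + kl_divergence(mu, log_var))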

Results

Reproduce

A well-trained VAE must be able to reproduce its input images.
Figure 5 in the paper shows the reproduction performance of learned generative models for different latent-space dimensionalities.
The following results can be reproduced with the command:

python run_main.py --dim_z <each value> --num_epochs 60
[Image grid: input images next to their reconstructions from 2-D, 5-D, 10-D, and 20-D latent spaces]
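
Reproduction is an encode-sample-decode round trip. A minimal sketch of the sampling step (the reparameterization trick) is below; encoder and decoder are hypothetical placeholders for the networks built in run_main.py.

import numpy as np

def reparameterize(mu, log_var):
    # Draw z = mu + sigma * eps with eps ~ N(0, I); writing the sample this
    # way keeps it differentiable with respect to mu and log_var.
    eps = np.random.randn(*np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

# Round trip (encoder/decoder are hypothetical placeholders):
# mu, log_var = encoder(x)
# x_recon = decoder(reparameterize(mu, log_var))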

Denoising

During training, salt-and-pepper noise is added to the input images so that the VAE learns to remove the noise and restore the original input.
The following results can be reproduced with the command:

python run_main.py --dim_z 20 --add_noise True --num_epochs 40
[Image grid: original input, noisy input, and the image restored by the VAE]
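
A minimal sketch of salt-and-pepper corruption is below; the noise rate and exact scheme here are illustrative assumptions, not necessarily the values used in run_main.py.

import numpy as np

def add_salt_and_pepper(x, rate=0.5):
    # Corrupt a fraction `rate` of the pixels: each corrupted pixel is set
    # to 0 (pepper) or 1 (salt) with equal probability. Rate is an assumed value.
    x_noisy = x.copy()
    mask = np.random.rand(*x.shape) < rate
    x_noisy[mask] = np.random.randint(0, 2, size=int(mask.sum())).astype(x.dtype)
    return x_noisy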

Learned MNIST manifold

Visualizations of the learned data manifold for generative models with a 2-dimensional latent space are given in Figure 4 of the paper.
The following results can be reproduced with the command:

python run_main.py --dim_z 2 --num_epochs 60 --PMLR True
[Image grid: learned MNIST manifold, and the distribution of labeled data in the 2-D latent space]
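
The manifold plot is produced by decoding a regular grid of points in the 2-D latent space. One common way to build such a grid, sketched below, maps evenly spaced quantiles through the inverse Gaussian CDF so the points cover the prior N(0, I) rather than a raw square; the decoder call is a hypothetical placeholder.

import numpy as np
from scipy.stats import norm

def latent_grid(n_x=20, n_y=20):
    # Inverse-CDF spacing concentrates grid points where the standard
    # normal prior has mass, matching the PMLR_n_img_x/y grid sizes.
    xs = norm.ppf(np.linspace(0.05, 0.95, n_x))
    ys = norm.ppf(np.linspace(0.05, 0.95, n_y))
    return np.array([(x, y) for y in ys for x in xs])  # shape (n_x * n_y, 2)

# z = latent_grid()
# images = decoder(z)  # hypothetical decoder; tile the outputs into the manifold plot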

Usage

Prerequisites

  1. Tensorflow
  2. Python packages: numpy, scipy, PIL (or Pillow), matplotlib

Command

python run_main.py --dim_z <latent vector dimension>

Example: python run_main.py --dim_z 20

Arguments

Required:

  • --dim_z: Dimension of latent vector. Default: 20

Optional:

  • --results_path: File path of output images. Default: results
  • --add_noise: Boolean for adding salt & pepper noise to input image. Default: False
  • --n_hidden: Number of hidden units in MLP. Default: 500
  • --learn_rate: Learning rate for Adam optimizer. Default: 1e-3
  • --num_epochs: The number of epochs to run. Default: 20
  • --batch_size: Batch size. Default: 128
  • --PRR: Boolean for plot-reproduce-result. Default: True
  • --PRR_n_img_x: Number of images along x-axis. Default: 10
  • --PRR_n_img_y: Number of images along y-axis. Default: 10
  • --PRR_resize_factor: Resize factor for each displayed image. Default: 1.0
  • --PMLR: Boolean for plot-manifold-learning-result. Default: False
  • --PMLR_n_img_x: Number of images along x-axis. Default: 20
  • --PMLR_n_img_y: Number of images along y-axis. Default: 20
  • --PMLR_resize_factor: Resize factor for each displayed image. Default: 1.0
  • --PMLR_n_samples: Number of samples used to plot the distribution of labeled data. Default: 5000

References

This implementation is based on the following projects:
[1] https://github.com/oduerr/dl_tutorial/tree/master/tensorflow/vae
[2] https://github.com/fastforwardlabs/vae-tf/tree/master
[3] https://github.com/kvfrans/variational-autoencoder
[4] https://github.com/altosaar/vae

Acknowledgements

This implementation has been tested with Tensorflow r0.12 on Windows 10.
