kevinzakka / vae-pytorch

Licence: other
AE and VAE Playground in PyTorch

Programming Languages

Jupyter Notebook, Python

Projects that are alternatives to or similar to vae-pytorch

keras-adversarial-autoencoders
Experiments with Adversarial Autoencoders using Keras
Stars: ✭ 20 (-62.26%)
Mutual labels:  autoencoder, variational-autoencoder
Repo 2017
Python codes in Machine Learning, NLP, Deep Learning and Reinforcement Learning with Keras and Theano
Stars: ✭ 1,123 (+2018.87%)
Mutual labels:  autoencoder, variational-autoencoder
AutoEncoders
Variational autoencoder, denoising autoencoder and other variations of autoencoders implementation in keras
Stars: ✭ 14 (-73.58%)
Mutual labels:  autoencoder, variational-autoencoder
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-66.04%)
Mutual labels:  autoencoder, variational-autoencoder
Tybalt
Training and evaluating a variational autoencoder for pan-cancer gene expression data
Stars: ✭ 126 (+137.74%)
Mutual labels:  autoencoder, variational-autoencoder
Tensorflow Mnist Vae
Tensorflow implementation of variational auto-encoder for MNIST
Stars: ✭ 422 (+696.23%)
Mutual labels:  autoencoder, variational-autoencoder
Neurec
Next RecSys Library
Stars: ✭ 731 (+1279.25%)
Mutual labels:  autoencoder, variational-autoencoder
Codeslam
Implementation of CodeSLAM — Learning a Compact, Optimisable Representation for Dense Visual SLAM paper (https://arxiv.org/pdf/1804.00874.pdf)
Stars: ✭ 64 (+20.75%)
Mutual labels:  autoencoder, variational-autoencoder
Rectorch
rectorch is a pytorch-based framework for state-of-the-art top-N recommendation
Stars: ✭ 121 (+128.3%)
Mutual labels:  autoencoder, variational-autoencoder
Smrt
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (+92.45%)
Mutual labels:  autoencoder, variational-autoencoder
Focal Frequency Loss
Focal Frequency Loss for Generative Models
Stars: ✭ 141 (+166.04%)
Mutual labels:  autoencoder, variational-autoencoder
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (+162.26%)
Mutual labels:  autoencoder, variational-autoencoder
shared-latent-space
Shared Latent Space VAEs
Stars: ✭ 15 (-71.7%)
Mutual labels:  autoencoder, variational-autoencoder
continuous Bernoulli
C programs for the simulator, transformation, and test statistic of the continuous Bernoulli distribution; also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-58.49%)
Mutual labels:  variational-autoencoder
deep-steg
Global NIPS Paper Implementation Challenge of "Hiding Images in Plain Sight: Deep Steganography"
Stars: ✭ 43 (-18.87%)
Mutual labels:  autoencoder
Image deionising auto encoder
Noise removal from images using a convolutional autoencoder
Stars: ✭ 34 (-35.85%)
Mutual labels:  autoencoder
catseye
Neural network library written in C and JavaScript
Stars: ✭ 29 (-45.28%)
Mutual labels:  autoencoder
multimodal-vae-public
A PyTorch implementation of "Multimodal Generative Models for Scalable Weakly-Supervised Learning" (https://arxiv.org/abs/1802.05335)
Stars: ✭ 98 (+84.91%)
Mutual labels:  variational-autoencoder
autoencoders tensorflow
Automatic feature engineering using deep learning and Bayesian inference using TensorFlow.
Stars: ✭ 66 (+24.53%)
Mutual labels:  autoencoder
CVAE Dial
CVAE_XGate model in paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-69.81%)
Mutual labels:  variational-autoencoder

AE and VAE Playground

Disclaimer: VAE coming soon...

Remarks

The last activation of the decoder, the loss function, and the normalization scheme used on the training data are crucial for obtaining good reconstructions and for keeping the loss from exploding or turning negative (BCE, for instance, is only well-behaved when its targets lie in [0, 1]).

  • If the data range is [-1, 1], then a tanh output activation with a mean squared error (MSE) loss does a good reconstruction job.
  • If the data range is [0, 1], then a sigmoid output activation with a binary cross-entropy (BCE) loss does a good reconstruction job.

I assume that by matching the range of the final activation to the range of the normalized data, we make it easier for the autoencoder's output to cover the input distribution. Both pairings are sketched below.
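A minimal PyTorch sketch of the two pairings, assuming 28x28 inputs flattened to 784 dimensions; the layer sizes are illustrative guesses, not taken from the notebooks:

```python
import torch
import torch.nn as nn

# Minimal fully-connected autoencoder. `out_activation` is the knob
# discussed above: tanh for data normalized to [-1, 1], sigmoid for
# data in [0, 1]. Layer sizes are assumptions.
class AutoEncoder(nn.Module):
    def __init__(self, out_activation):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(784, 128), nn.ReLU(),
            nn.Linear(128, 32),
        )
        self.decoder = nn.Sequential(
            nn.Linear(32, 128), nn.ReLU(),
            nn.Linear(128, 784), out_activation,
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

x01 = torch.rand(16, 784)   # fake batch normalized to [0, 1]
x11 = x01 * 2.0 - 1.0       # the same batch rescaled to [-1, 1]

# Pairing 1: sigmoid output + binary cross-entropy on [0, 1] data.
bce_model = AutoEncoder(nn.Sigmoid())
bce_loss = nn.BCELoss()(bce_model(x01), x01)

# Pairing 2: tanh output + mean squared error on [-1, 1] data.
mse_model = AutoEncoder(nn.Tanh())
mse_loss = nn.MSELoss()(mse_model(x11), x11)
```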

Reconstruction images (shown in the original README) for the fully-connected and stacked configurations:

  • Simple fully-connected autoencoder (MSE)
  • Simple fully-connected autoencoder with tanh (MSE)
  • Simple fully-connected autoencoder (BCE)
  • Simple fully-connected autoencoder with tanh and L1 regularization (MSE)
  • Stacked 6-layer autoencoder (MSE)
  • Stacked 6-layer autoencoder with tanh (MSE)
  • Stacked 6-layer autoencoder (BCE)
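One of the configurations above adds L1 regularization to the tanh/MSE setup. A hedged sketch of one plausible formulation, building on the AutoEncoder class from the previous snippet; the 1e-4 weight, and the choice to penalize the latent code rather than the weights, are assumptions:

```python
# Tanh output, data in [-1, 1], as in the MSE pairing above.
model = AutoEncoder(nn.Tanh())
x = torch.rand(16, 784) * 2.0 - 1.0   # fake batch in [-1, 1]

code = model.encoder(x)
recon = model.decoder(code)

# MSE reconstruction term plus an L1 sparsity penalty on the latent
# code; the weight is a made-up hyperparameter.
l1_weight = 1e-4
loss = nn.MSELoss()(recon, x) + l1_weight * code.abs().mean()
loss.backward()
```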

Reconstruction images for the convolutional configurations:

  • Convolutional autoencoder with tanh (MSE)
  • Convolutional autoencoder (BCE)
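For the convolutional variants, a minimal sketch of what such a model might look like, again with assumed channel counts and kernel sizes (shown here with the sigmoid/BCE pairing on [0, 1] data):

```python
import torch
import torch.nn as nn

# Hedged sketch of a convolutional autoencoder for 1x28x28 inputs;
# channel counts and kernel sizes are assumptions, not from the notebooks.
class ConvAutoEncoder(nn.Module):
    def __init__(self, out_activation):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1),   # 28 -> 14
            nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1),  # 14 -> 7
            nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 3, stride=2, padding=1,
                               output_padding=1),        # 7 -> 14
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 3, stride=2, padding=1,
                               output_padding=1),        # 14 -> 28
            out_activation,
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

x = torch.rand(16, 1, 28, 28)                   # fake batch in [0, 1]
model = ConvAutoEncoder(nn.Sigmoid())
loss = nn.BCELoss()(model(x), x)                # sigmoid + BCE pairing
```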
