nadavbh12 / vq-vae

License: BSD-3-Clause
Minimalist implementation of VQ-VAE in PyTorch


CVAE and VQ-VAE

This is an implementation of the VQ-VAE (Vector Quantized Variational Autoencoder) from [Neural Discrete Representation Learning](https://arxiv.org/abs/1711.00937), together with a Convolutional Variational Autoencoder, for compressing MNIST and CIFAR-10. The code is based upon pytorch/examples/vae.
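The defining step of VQ-VAE is replacing each encoder output vector with its nearest neighbor in a learned dictionary (codebook), while passing gradients straight through to the encoder. Below is a minimal sketch of that quantization step, not the repository's actual code; `vector_quantize` and the flattened `(N, d)` layout are assumptions for illustration:

```python
import torch

def vector_quantize(z_e, codebook):
    """Sketch of VQ-VAE quantization (illustrative, not this repo's code).

    z_e:      (N, d) encoder outputs, flattened over spatial positions
    codebook: (k, d) dictionary of k codewords of dimension d
    """
    dists = torch.cdist(z_e, codebook)   # (N, k) pairwise L2 distances
    idx = dists.argmin(dim=1)            # index of nearest codeword per vector
    z_q = codebook[idx]                  # quantized latents
    # Straight-through estimator: forward pass uses z_q, but the gradient
    # w.r.t. z_q_st flows back to z_e unchanged, skipping the argmin.
    z_q_st = z_e + (z_q - z_e).detach()
    return z_q_st, idx
```

The codebook and commitment losses from the paper would be computed from `z_e`, `z_q`, and their detached copies; they are omitted here to keep the sketch focused on the lookup.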

```shell
pip install -r requirements.txt
python main.py
```

requirements

  • Python 3.6 (3.5 may work as well)
  • PyTorch 0.4
  • Additional requirements in requirements.txt

Results

All images are taken from the test set. Top row is the original image. Bottom row is the reconstruction.

k is the number of elements in the dictionary; d is the dimension of each element (the number of channels in the bottleneck).

  • MNIST (k=10, d=64)

mnist

  • CIFAR10 (k=128, d=256)

CIFAR10

  • ImageNet (k=512, d=128)

imagenet
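As a concrete illustration of the (k, d) settings above, each dictionary can be stored as an embedding table of shape (k, d). The use of `nn.Embedding` here is an assumption for the sketch, not necessarily how this repository stores its codebook:

```python
import torch.nn as nn

# (k, d) pairs as reported for each dataset above
settings = {"MNIST": (10, 64), "CIFAR10": (128, 256), "ImageNet": (512, 128)}

for name, (k, d) in settings.items():
    codebook = nn.Embedding(k, d)  # k codewords, each a d-dimensional vector
    print(name, tuple(codebook.weight.shape))
```

Larger k gives the quantizer more codewords to choose from; larger d gives each codeword more capacity, at the cost of a wider bottleneck.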

TODO:

  • [ ] Implement Continuous Relaxation Training of Discrete Latent Variable Image Models

  • [ ] Sample using PixelCNN prior

  • [ ] Improve results on CIFAR-10: perform the nearest-neighbor search over 10 dictionaries rather than 1

  • [ ] Improve results on CIFAR-10: replace MSE with NLL

  • [ ] Improve results on CIFAR-10: measure bits/dim

  • [ ] Compare the architecture with the official one

  • [X] Merge VAE and VQ-VAE for MNIST and CIFAR-10 into one script

Acknowledgement

Thanks to tf-vaevae for a good reference.
