ermongroup / Generalized-PixelVAE

License: MIT
PixelVAE with or without regularization

Programming Languages

python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to Generalized-PixelVAE

Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversarial Networks (GANs), Popular GANs Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+331.25%)
Mutual labels:  generative-model, variational-inference
Variational Ladder Autoencoder
Implementation of VLAE
Stars: ✭ 196 (+206.25%)
Mutual labels:  generative-model, variational-inference
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+553.13%)
Mutual labels:  generative-model, variational-inference
AI Learning Hub
AI Learning Hub for Machine Learning, Deep Learning, Computer Vision and Statistics
Stars: ✭ 53 (-17.19%)
Mutual labels:  generative-model, variational-inference
Deep Generative Models For Natural Language Processing
DGMs for NLP. A roadmap.
Stars: ✭ 185 (+189.06%)
Mutual labels:  generative-model, variational-inference
adaptive-f-divergence
A tensorflow implementation of the NIPS 2018 paper "Variational Inference with Tail-adaptive f-Divergence"
Stars: ✭ 20 (-68.75%)
Mutual labels:  generative-model, variational-inference
artificial neural networks
A collection of Methods and Models for various architectures of Artificial Neural Networks
Stars: ✭ 40 (-37.50%)
Mutual labels:  variational-inference
noisy-K-FAC
Natural Gradient, Variational Inference
Stars: ✭ 29 (-54.69%)
Mutual labels:  variational-inference
gcWGAN
Guided Conditional Wasserstein GAN for De Novo Protein Design
Stars: ✭ 38 (-40.62%)
Mutual labels:  generative-model
py-msa-kdenlive
Python script to load a Kdenlive (OSS NLE video editor) project file, and conform the edit on video or numpy arrays.
Stars: ✭ 25 (-60.94%)
Mutual labels:  generative-model
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-68.75%)
Mutual labels:  variational-inference
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-71.87%)
Mutual labels:  variational-inference
probai-2021-pyro
Repo for the Tutorials of Day1-Day3 of the Nordic Probabilistic AI School 2021 (https://probabilistic.ai/)
Stars: ✭ 45 (-29.69%)
Mutual labels:  variational-inference
gans-in-action
"GAN 인 액션"(한빛미디어, 2020)의 코드 저장소입니다.
Stars: ✭ 29 (-54.69%)
Mutual labels:  generative-model
graph-nvp
GraphNVP: An Invertible Flow Model for Generating Molecular Graphs
Stars: ✭ 69 (+7.81%)
Mutual labels:  generative-model
VINF
Repository for DTU Special Course, focusing on Variational Inference using Normalizing Flows (VINF). Supervised by Michael Riis Andersen
Stars: ✭ 23 (-64.06%)
Mutual labels:  variational-inference
deep-active-inference-mc
Deep active inference agents using Monte-Carlo methods
Stars: ✭ 41 (-35.94%)
Mutual labels:  variational-inference
GrabNet
GrabNet: A Generative model to generate realistic 3D hands grasping unseen objects (ECCV2020)
Stars: ✭ 146 (+128.13%)
Mutual labels:  generative-model
char-VAE
Inspired by the neural style algorithm in the computer vision field, we propose a high-level language model with the aim of adapting the linguistic style.
Stars: ✭ 18 (-71.87%)
Mutual labels:  generative-model
prosper
A Python Library for Probabilistic Sparse Coding with Non-Standard Priors and Superpositions
Stars: ✭ 17 (-73.44%)
Mutual labels:  variational-inference

Generalized VAE with PixelCNN Decoder

This repo implements the methods described in Towards a Deeper Understanding of Variational Autoencoding Models (https://arxiv.org/abs/1702.08658). A VAE with a powerful decoder family such as PixelCNN tends to ignore the latent code and use the decoding distribution alone to represent the entire dataset. The paper shows that this phenomenon is not inevitable: in a more general family of VAE models, there are members that prefer to use the latent code. In particular, without any regularization on the posterior, the model prefers to use the latent code. Furthermore, we can still obtain correct samples, albeit only by running a Markov chain.
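Concretely, the chain alternates between the approximate posterior and the decoder: draw z ~ q(z|x), then x ~ p(x|z), and repeat. A minimal sketch of this sampler, assuming hypothetical encoder_sample and decoder_sample functions (not names from this repo) that draw from q(z|x) and p(x|z) respectively:

    def markov_chain_sample(encoder_sample, decoder_sample, x_init, steps=50):
        """Gibbs-style chain: z ~ q(z|x), then x ~ p(x|z), repeated.

        encoder_sample and decoder_sample stand in for the trained networks;
        after enough steps, x is approximately a sample from the model.
        """
        x = x_init
        for _ in range(steps):
            z = encoder_sample(x)  # latent code from the approximate posterior
            x = decoder_sample(z)  # image from the PixelCNN decoder
        return x

The chain would typically be started from noise in image space; the sample grids below were generated this way.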

  • Samples generated by the model without regularization: (image: mc_noreg)

  • Samples generated by the model with ELBO regularization: (image: mc_elbo)

Training with Default Options

Setup

Make sure you have the following installed:

  • Python 2 or 3 with numpy and scipy
  • TensorFlow (tested on TensorFlow 0.12)
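A quick way to sanity-check the environment (a minimal sketch; any TensorFlow 0.12.x build reports its version the same way):

    # Verify that the dependencies import and print the TensorFlow version.
    import numpy
    import scipy
    import tensorflow as tf

    print(tf.__version__)  # this code base was tested against 0.12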

Train on CIFAR

To train on CIFAR with ELBO regularization:

python train.py --use_autoencoder --save_dir=elbo --reg_type=elbo --gpus=0,1,2,3

To train on CIFAR without regularization:

python train.py --use_autoencoder --save_dir=no_reg --reg_type=no_reg --gpus=0,1,2,3

Replace --gpus=[ids] with the IDs of the GPUs present on your system.

Additional Options

  • To use a particular GPU or GPUs, add the option --gpus=[ids], e.g. --gpus=0,1 to use GPUs 0 and 1. Using multiple GPUs is recommended.
  • To specify the batch size, use --batch_size=[size].
  • To specify the dimension of the latent code, use --latent_dim=[dim].
  • To specify the directory for all checkpoints, logs, and visualizations, use --save_dir=/path/to/folder. To visualize with TensorBoard, point its logdir at this directory.
  • To resume from a checkpoint file if one exists in the model directory, use --load_params.
  • For more options and their meanings, please refer to the original PixelCNN++; a combined example invocation is sketched below.
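Putting the options together, an illustrative invocation might look like this (the batch size and latent dimension values here are arbitrary examples, not recommendations):

    # Train with ELBO regularization on GPUs 0 and 1, resuming from a
    # checkpoint in the save directory if one exists.
    python train.py --use_autoencoder --reg_type=elbo --save_dir=elbo \
        --gpus=0,1 --batch_size=64 --latent_dim=20 --load_params

    # Point TensorBoard at the same directory to monitor training.
    tensorboard --logdir=elbo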