
clear-nus / CHyVAE

Licence: other
Code for our paper -- Hyperprior Induced Unsupervised Disentanglement of Latent Representations (AAAI 2019)

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives of or similar to CHyVAE

linguistic-style-transfer-pytorch
Implementation of "Disentangled Representation Learning for Non-Parallel Text Style Transfer(ACL 2019)" in Pytorch
Stars: ✭ 55 (+205.56%)
Mutual labels:  variational-autoencoder, disentangled-representations, latent-representations
Li emnlp 2017
Deep Recurrent Generative Decoder for Abstractive Text Summarization in DyNet
Stars: ✭ 56 (+211.11%)
Mutual labels:  generative-model, variational-autoencoder
Simple Variational Autoencoder
A VAE written entirely in Numpy/Cupy
Stars: ✭ 20 (+11.11%)
Mutual labels:  generative-model, variational-autoencoder
Alae
[CVPR2020] Adversarial Latent Autoencoders
Stars: ✭ 3,178 (+17555.56%)
Mutual labels:  paper, generative-model
Neuraldialog Cvae
Tensorflow Implementation of Knowledge-Guided CVAE for dialog generation ACL 2017. It is released by Tiancheng Zhao (Tony) from Dialog Research Center, LTI, CMU
Stars: ✭ 279 (+1450%)
Mutual labels:  generative-model, variational-autoencoder
Tensorflow Generative Model Collections
Collection of generative models in Tensorflow
Stars: ✭ 3,785 (+20927.78%)
Mutual labels:  generative-model, variational-autoencoder
Vae For Image Generation
Implemented Variational Autoencoder generative model in Keras for image generation and its latent space visualization on MNIST and CIFAR10 datasets
Stars: ✭ 87 (+383.33%)
Mutual labels:  generative-model, variational-autoencoder
Vae protein function
Protein function prediction using a variational autoencoder
Stars: ✭ 57 (+216.67%)
Mutual labels:  generative-model, variational-autoencoder
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (+950%)
Mutual labels:  paper, generative-model
glico-learning-small-sample
Generative Latent Implicit Conditional Optimization when Learning from Small Sample ICPR 20'
Stars: ✭ 20 (+11.11%)
Mutual labels:  paper, generative-model
AC-VRNN
PyTorch code for CVIU paper "AC-VRNN: Attentive Conditional-VRNN for Multi-Future Trajectory Prediction"
Stars: ✭ 21 (+16.67%)
Mutual labels:  generative-model, variational-autoencoder
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Autoencoder (VAE), Generative Adversarial Networks (GANs), Popular GAN Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc.
Stars: ✭ 276 (+1433.33%)
Mutual labels:  generative-model, variational-autoencoder
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (+211.11%)
Mutual labels:  generative-model, variational-autoencoder
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+2222.22%)
Mutual labels:  generative-model, variational-autoencoder
Jukebox
Code for the paper "Jukebox: A Generative Model for Music"
Stars: ✭ 4,863 (+26916.67%)
Mutual labels:  paper, generative-model
eccv16 attr2img
Torch Implemention of ECCV'16 paper: Attribute2Image
Stars: ✭ 93 (+416.67%)
Mutual labels:  generative-model, variational-autoencoder
vae-torch
Variational autoencoder for anomaly detection (in PyTorch).
Stars: ✭ 38 (+111.11%)
Mutual labels:  generative-model, variational-autoencoder
DVAE
Official implementation of Dynamical VAEs
Stars: ✭ 75 (+316.67%)
Mutual labels:  generative-model
Islands
A spigot plugin for creating customisable home islands with different biomes. https://www.spigotmc.org/resources/islands-home-islands-system.84303/
Stars: ✭ 18 (+0%)
Mutual labels:  paper
timbre painting
Hierarchical fast and high-fidelity audio generation
Stars: ✭ 67 (+272.22%)
Mutual labels:  generative-model

CHyVAE

Code for our paper Hyperprior Induced Unsupervised Disentanglement of Latent Representations (AAAI-19). The correlated ellipses dataset used in the paper can be found here.
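The paper's hyperprior is a Wishart-family prior over the covariance of the latent code, and the `--nu` flag below sets its degrees of freedom. As an illustrative sketch only (not the repository's implementation, and assuming an identity scale matrix), a covariance sample from an inverse-Wishart with integer degrees of freedom can be drawn in plain numpy:

```python
import numpy as np

def sample_inverse_wishart(nu, d, rng):
    # For integer nu >= d and identity scale, a Wishart(I, nu) draw is the
    # sum of nu outer products of standard normal vectors; inverting that
    # draw yields an inverse-Wishart(I, nu) covariance sample.
    x = rng.standard_normal((nu, d))
    w = x.T @ x
    return np.linalg.inv(w)

rng = np.random.default_rng(0)
sigma = sample_inverse_wishart(nu=200, d=10, rng=rng)
print(sigma.shape)
```

Larger `nu` concentrates the prior more tightly around its mean, which is one intuition for why it appears as a tunable flag in training.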

Requirements

  • Python 3
  • TensorFlow (tested on 1.10.1)
  • NumPy (tested on 1.14.5)
  • OpenCV (tested on 3.4.3)
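A quick way to compare your environment against the tested versions above is a small import check (note OpenCV's Python module is named `cv2`; this snippet is just a convenience, not part of the repository):

```python
import importlib

# Tested versions taken from the requirements list above.
tested = {"numpy": "1.14.5", "tensorflow": "1.10.1", "cv2": "3.4.3"}

def installed_versions(modules):
    # Map each module name to its installed version, or None if missing,
    # without raising on absent packages.
    found = {}
    for name in modules:
        try:
            mod = importlib.import_module(name)
            found[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            found[name] = None
    return found

for name, version in installed_versions(tested).items():
    print(f"{name}: {version or 'not installed'} (tested on {tested[name]})")
```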

Usage

Setting up the datasets

Navigate to data/ and run setup_2dshapes.sh and setup_corr-ell.sh to set up the 2dshapes and correlated_ellipses datasets, respectively.
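After the scripts finish, you can sanity-check that the dataset files are where the training code will look for them. The file paths below are assumptions for illustration; substitute whatever the setup scripts actually produce:

```python
import os

# Hypothetical output locations; adjust to the files your setup scripts create.
candidate_paths = [
    "data/2dshapes.npz",
    "data/correlated_ellipses.npz",
]

def check_datasets(paths):
    # Map each expected dataset file to whether it exists on disk.
    return {p: os.path.isfile(p) for p in paths}

for path, ok in check_datasets(candidate_paths).items():
    print(f"{path}: {'found' if ok else 'missing'}")
```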

Training a model

Navigate to code/ and run

python main.py \
       --dataset [2dshapes/correlated_ellipses] \
       --z_dim [dim. of latent space] \
       --n_steps [number of training steps] \
       --nu [degrees of freedom] \
       --batch_size [batch size]

The reconstruction error and disentanglement metric will be logged at a set interval as training proceeds.
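For reference, a command-line interface with these flags can be sketched with argparse. This is a hypothetical mirror of the documented options, with defaults borrowed from the example run, not the repository's actual parser:

```python
import argparse

def build_parser():
    # Hypothetical parser mirroring the documented flags; defaults are
    # taken from the example run in this README, not from main.py itself.
    p = argparse.ArgumentParser(description="Train CHyVAE")
    p.add_argument("--dataset", choices=["2dshapes", "correlated_ellipses"],
                   default="correlated_ellipses")
    p.add_argument("--z_dim", type=int, default=10,
                   help="dimensionality of the latent space")
    p.add_argument("--n_steps", type=int, default=150000,
                   help="number of training steps")
    p.add_argument("--nu", type=int, default=200,
                   help="degrees of freedom of the hyperprior")
    p.add_argument("--batch_size", type=int, default=50)
    return p

args = build_parser().parse_args([])
print(args.dataset, args.z_dim, args.nu, args.batch_size)
```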

Example Run

python main.py --dataset correlated_ellipses --z_dim 10 --n_steps 150000 --nu 200 --batch_size 50

Run python main.py -h for help.

Datasets

The repository currently includes code for experiments on the following datasets.

  • 2DShapes
  • CorrelatedEllipses

Additional Results

For additional qualitative results, please check AdditionalResults.md.

Contact

For any questions regarding the code or the paper, please email [email protected].

BibTeX

@inproceedings{ansari2019hyperprior,
  title={Hyperprior Induced Unsupervised Disentanglement of Latent Representations},
  author={Ansari, Abdul Fatir and Soh, Harold},
  booktitle={AAAI Conference on Artificial Intelligence},
  year={2019}
}