jmtomczak / Vae_vampprior

License: MIT
Code for the paper "VAE with a VampPrior", J.M. Tomczak & M. Welling

Programming Language: Python

Projects that are alternatives of or similar to Vae vampprior

RG-Flow
This is project page for the paper "RG-Flow: a hierarchical and explainable flow model based on renormalization group and sparse prior". Paper link: https://arxiv.org/abs/2010.00029
Stars: ✭ 58 (-66.47%)
Mutual labels:  generative-model, representation-learning
Variational Ladder Autoencoder
Implementation of VLAE
Stars: ✭ 196 (+13.29%)
Mutual labels:  generative-model, representation-learning
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-67.63%)
Mutual labels:  generative-model, representation-learning
ShapeFormer
Official repository for the ShapeFormer Project
Stars: ✭ 97 (-43.93%)
Mutual labels:  generative-model, representation-learning
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+141.62%)
Mutual labels:  generative-model, representation-learning
Kate
Code & data accompanying the KDD 2017 paper "KATE: K-Competitive Autoencoder for Text"
Stars: ✭ 135 (-21.97%)
Mutual labels:  representation-learning
Stylegan2 Pytorch
Simplest working implementation of Stylegan2, state of the art generative adversarial network, in Pytorch. Enabling everyone to experience disentanglement
Stars: ✭ 2,656 (+1435.26%)
Mutual labels:  generative-model
Hyte
EMNLP 2018: HyTE: Hyperplane-based Temporally aware Knowledge Graph Embedding
Stars: ✭ 130 (-24.86%)
Mutual labels:  representation-learning
Srl Zoo
State Representation Learning (SRL) zoo with PyTorch - Part of S-RL Toolbox
Stars: ✭ 125 (-27.75%)
Mutual labels:  representation-learning
Jodie
A PyTorch implementation of ACM SIGKDD 2019 paper "Predicting Dynamic Embedding Trajectory in Temporal Interaction Networks"
Stars: ✭ 172 (-0.58%)
Mutual labels:  representation-learning
Deformable Kernels
Deforming kernels to adapt towards object deformation. In ICLR 2020.
Stars: ✭ 166 (-4.05%)
Mutual labels:  representation-learning
Gretel Synthetics
Differentially private learning to create fake, synthetic datasets with enhanced privacy guarantees
Stars: ✭ 147 (-15.03%)
Mutual labels:  generative-model
Gesturegan
[ACM MM 2018 Oral] GestureGAN for Hand Gesture-to-Gesture Translation in the Wild
Stars: ✭ 136 (-21.39%)
Mutual labels:  generative-model
Disentangled Person Image Generation
Tensorflow implementation of CVPR 2018 paper "Disentangled Person Image Generation"
Stars: ✭ 158 (-8.67%)
Mutual labels:  generative-model
Role2vec
A scalable Gensim implementation of "Learning Role-based Graph Embeddings" (IJCAI 2018).
Stars: ✭ 134 (-22.54%)
Mutual labels:  representation-learning
Awesome Visual Representation Learning With Transformers
Awesome Transformers (self-attention) in Computer Vision
Stars: ✭ 166 (-4.05%)
Mutual labels:  representation-learning
First Order Model
This repository contains the source code for the paper First Order Motion Model for Image Animation
Stars: ✭ 11,964 (+6815.61%)
Mutual labels:  generative-model
Attribute Aware Attention
[ACM MM 2018] Attribute-Aware Attention Model for Fine-grained Representation Learning
Stars: ✭ 143 (-17.34%)
Mutual labels:  representation-learning
Mine Mutual Information Neural Estimation
A pytorch implementation of MINE(Mutual Information Neural Estimation)
Stars: ✭ 167 (-3.47%)
Mutual labels:  generative-model
Conditional Gan
Anime Generation
Stars: ✭ 141 (-18.5%)
Mutual labels:  generative-model

VAE with a VampPrior

This is a PyTorch implementation of a new prior ("Variational Mixture of Posteriors" prior, or VampPrior for short) for the variational auto-encoder framework, with one or two layers of stochastic hidden units, as described in the following paper:

  • Jakub M. Tomczak, Max Welling, VAE with a VampPrior, arXiv preprint, 2017
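The VampPrior replaces the standard Gaussian prior with a mixture of variational posteriors evaluated at K learned pseudo-inputs u_k, i.e. p(z) = (1/K) Σ_k q(z | u_k). The repository implements this in PyTorch; below is only a minimal NumPy sketch of evaluating such a mixture log-density, assuming hypothetical diagonal-Gaussian components (function names are illustrative, not from the codebase):

```python
import numpy as np

def log_normal_diag(z, mean, log_var):
    # Log-density of a diagonal Gaussian, summed over the latent dimensions.
    return -0.5 * np.sum(
        log_var + (z - mean) ** 2 / np.exp(log_var) + np.log(2 * np.pi), axis=-1
    )

def vampprior_log_density(z, pseudo_means, pseudo_log_vars):
    # log p(z) = logsumexp_k log q(z | u_k) - log K, computed stably.
    # pseudo_means / pseudo_log_vars: (K, D) posterior parameters at the
    # K pseudo-inputs (here passed in directly; in the model they come
    # from running the encoder on learned pseudo-inputs).
    comp = log_normal_diag(z[None, :], pseudo_means, pseudo_log_vars)  # (K,)
    m = comp.max()
    return m + np.log(np.mean(np.exp(comp - m)))

# Sanity check: with a single standard-normal component, the VampPrior
# reduces to the standard prior N(0, I).
z = np.zeros(2)
lp = vampprior_log_density(z, np.zeros((1, 2)), np.zeros((1, 2)))
```

With one component centered at zero with unit variance, lp equals the standard-normal log-density at the origin, -log(2π) for two dimensions.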

Requirements

The code is compatible with:

  • pytorch 0.2.0

Data

The experiments can be run on the following datasets:

  • static MNIST: links to the datasets can be found at link;
  • binary MNIST: the dataset is loaded from PyTorch;
  • OMNIGLOT: the dataset can be downloaded at link;
  • Caltech 101 Silhouettes: the dataset can be downloaded at link;
  • Frey Faces: the dataset can be downloaded at link;
  • Histopathology Gray: the dataset can be downloaded at link;
  • CIFAR 10: the dataset is loaded from PyTorch.
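Binarized MNIST is commonly obtained by treating pixel intensities in [0, 1] as Bernoulli probabilities and sampling. As a hedged sketch of such dynamic binarization (the exact preprocessing used here is defined in the repository's data-loading code, not reproduced below):

```python
import numpy as np

def binarize(images, rng=None):
    # Dynamically binarize images: each pixel intensity in [0, 1] is used
    # as the success probability of a Bernoulli draw.
    rng = rng or np.random.default_rng(0)
    return (rng.random(images.shape) < images).astype(np.float32)

imgs = np.array([[0.0, 1.0, 0.5]])
b = binarize(imgs)  # pixels at 0.0 stay 0, pixels at 1.0 become 1
```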

Run the experiment

  1. Set up your experiment in experiment.py.
  2. Run the experiment:
python experiment.py

Models

You can run a vanilla VAE, a one-layered VAE, or a two-layered HVAE with either the standard prior or the VampPrior by setting the model_name argument to: (i) vae or hvae_2level for MLPs, (ii) convvae_2level for convnets, or (iii) pixelhvae_2level for (ii) with a PixelCNN-based decoder, and by setting the prior argument to either standard or vampprior.
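For example, assuming the arguments are exposed as command-line flags named after the arguments above (check experiment.py for the exact flag names), an invocation might look like:

```shell
# Hypothetical invocation: two-layered HVAE with a PixelCNN decoder and the VampPrior.
python experiment.py --model_name pixelhvae_2level --prior vampprior
```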

Citation

Please cite our paper if you use this code in your research:

@article{TW:2017,
  title={{VAE with a VampPrior}},
  author={Tomczak, Jakub M and Welling, Max},
  journal={arXiv},
  year={2017}
}

Acknowledgments

The research conducted by Jakub M. Tomczak was funded by the European Commission within the Marie Skłodowska-Curie Individual Fellowship (Grant No. 702666, "Deep learning and Bayesian inference for medical imaging").
