
riannevdberg / Sylvester Flows

License: MIT

Programming Languages

Python
139,335 projects; #7 most used programming language

Projects that are alternatives to or similar to Sylvester Flows

Dfc Vae
Variational Autoencoder trained with Feature Perceptual Loss
Stars: ✭ 74 (-51.32%)
Mutual labels:  vae
Numpy Ml
Machine learning, in numpy
Stars: ✭ 11,100 (+7202.63%)
Mutual labels:  vae
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (-8.55%)
Mutual labels:  vae
Vae For Image Generation
A Variational Autoencoder generative model implemented in Keras for image generation and latent-space visualization on the MNIST and CIFAR10 datasets
Stars: ✭ 87 (-42.76%)
Mutual labels:  vae
Cross Lingual Voice Cloning
Tacotron 2 - PyTorch implementation with faster-than-realtime inference, modified to enable cross-lingual voice cloning.
Stars: ✭ 106 (-30.26%)
Mutual labels:  vae
O Gan
O-GAN: Extremely Concise Approach for Auto-Encoding Generative Adversarial Networks
Stars: ✭ 117 (-23.03%)
Mutual labels:  vae
Elliot
Comprehensive and Rigorous Framework for Reproducible Recommender Systems Evaluation
Stars: ✭ 49 (-67.76%)
Mutual labels:  vae
Beat Blender
Blend beats using machine learning to create music in a fun new way.
Stars: ✭ 147 (-3.29%)
Mutual labels:  vae
Mojitalk
Code for "MojiTalk: Generating Emotional Responses at Scale" https://arxiv.org/abs/1711.04090
Stars: ✭ 107 (-29.61%)
Mutual labels:  vae
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (-11.84%)
Mutual labels:  vae
Generative Models
Comparison of Generative Models in Tensorflow
Stars: ✭ 96 (-36.84%)
Mutual labels:  vae
Smrt
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (-32.89%)
Mutual labels:  vae
Vae Tensorflow
A Tensorflow implementation of a Variational Autoencoder for the deep learning course at the University of Southern California (USC).
Stars: ✭ 117 (-23.03%)
Mutual labels:  vae
Attend infer repeat
A Tensorflow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-46.05%)
Mutual labels:  vae
Vmf vae nlp
Code for EMNLP18 paper "Spherical Latent Spaces for Stable Variational Autoencoders"
Stars: ✭ 140 (-7.89%)
Mutual labels:  vae
Vae protein function
Protein function prediction using a variational autoencoder
Stars: ✭ 57 (-62.5%)
Mutual labels:  vae
Pytorch cpp
Deep Learning sample programs using PyTorch in C++
Stars: ✭ 114 (-25%)
Mutual labels:  vae
Pytorch Vae
A Collection of Variational Autoencoders (VAE) in PyTorch.
Stars: ✭ 2,704 (+1678.95%)
Mutual labels:  vae
Vae Seq
Variational Auto-Encoders in a Sequential Setting.
Stars: ✭ 145 (-4.61%)
Mutual labels:  vae
Srl Zoo
State Representation Learning (SRL) zoo with PyTorch - Part of S-RL Toolbox
Stars: ✭ 125 (-17.76%)
Mutual labels:  vae

Sylvester normalizing flows for variational inference

PyTorch implementation of Sylvester normalizing flows, based on our paper:

Sylvester normalizing flows for variational inference (UAI 2018)
Rianne van den Berg*, Leonard Hasenclever*, Jakub Tomczak, Max Welling

*Equal contribution

Requirements

The latest release of the code is compatible with:

  • pytorch 1.0.0

  • python 3.7

Thanks to Martin Engelcke for adapting the code to provide this compatibility.

Version v0.3.0_2.7 is compatible with:

  • pytorch 0.3.0. WARNING: more recent versions of pytorch use different default flags for the binary cross-entropy loss module nn.BCELoss(). You have to adapt these flags if you want to port this code to a later version; see the sketch after this list.

  • python 2.7
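
As a rough illustration of the flag change, the adaptation looks like the sketch below. Whether the reconstruction term is summed or averaged should be checked against the repository's loss code; the summed form shown here is only an assumption.

```python
import torch.nn as nn

# pytorch 0.3.x: a summed (rather than averaged) loss is selected
# with the size_average flag
bce_old = nn.BCELoss(size_average=False)

# pytorch >= 0.4.1 / 1.0: size_average is deprecated in favor of the
# reduction argument; this is the equivalent setting
bce_new = nn.BCELoss(reduction='sum')
```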

Data

The experiments can be run on the following datasets:

  • static MNIST: the dataset is included in the data folder;
  • OMNIGLOT: the dataset can be downloaded from link;
  • Caltech 101 Silhouettes: the dataset can be downloaded from link;
  • Frey Faces: the dataset can be downloaded from link.

Usage

Below are example commands for running experiments on static MNIST with the different types of Sylvester normalizing flows, each using 4 flows:

Orthogonal Sylvester flows
This example uses a bottleneck of size 8 (Q has 8 orthonormal columns).

python main_experiment.py -d mnist -nf 4 --flow orthogonal --num_ortho_vecs 8 
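
For intuition, here is a minimal sketch of a single orthogonal Sylvester transformation z' = z + Q R tanh(R̃ Qᵀ z + b); the function name and signature are illustrative and do not mirror the repository's module structure:

```python
import torch

def sylvester_step(z, Q, R, R_tilde, b):
    """One Sylvester flow step: z' = z + Q R tanh(R_tilde Q^T z + b).

    z: (batch, d); Q: (d, M) with M orthonormal columns (the bottleneck);
    R, R_tilde: (M, M) upper triangular; b: (M,).
    """
    pre = z @ Q @ R_tilde.t() + b                    # R_tilde Q^T z + b
    z_new = z + torch.tanh(pre) @ R.t() @ Q.t()      # z + Q R tanh(...)
    # By Sylvester's determinant identity, the Jacobian determinant
    # reduces to a product over the M diagonal entries of R and R_tilde.
    h_prime = 1 - torch.tanh(pre) ** 2
    diag = torch.diagonal(R) * torch.diagonal(R_tilde)
    log_det = torch.log(torch.abs(1 + diag * h_prime)).sum(dim=1)
    return z_new, log_det
```

With --num_ortho_vecs 8, M = 8 regardless of the latent dimension, which is what makes Q a bottleneck.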

Householder Sylvester flows
This example uses 8 Householder reflections per orthogonal matrix Q.

python main_experiment.py -d mnist -nf 4 --flow householder --num_householder 8
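
Here Q is parameterized as a product of Householder reflections, which is orthogonal by construction; the sketch below shows the idea with illustrative names, not the repository's exact code:

```python
import torch

def householder_q(vs):
    """Build an orthogonal Q as a product of Householder reflections
    H_k = I - 2 v_k v_k^T / ||v_k||^2.

    vs: (K, d) stack of reflection vectors; returns Q of shape (d, d).
    """
    d = vs.shape[1]
    Q = torch.eye(d)
    for v in vs:
        v = v / v.norm()
        # apply H_k on the left without materializing the d x d matrix H_k
        Q = Q - 2.0 * v.unsqueeze(1) * (v @ Q).unsqueeze(0)
    return Q
```

Since every reflection is orthogonal, Q remains exactly orthogonal under gradient updates of the vectors v_k, with no explicit orthogonality constraint needed.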

Triangular Sylvester flows

python main_experiment.py -d mnist -nf 4 --flow triangular 

To run an experiment with other types of normalizing flows or just with a factorized Gaussian posterior, see below.


Factorized Gaussian posterior

python main_experiment.py -d mnist --flow no_flow
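
With no_flow the posterior is a diagonal Gaussian sampled via the standard reparameterization trick; a minimal sketch (names are illustrative):

```python
import torch

def reparameterize(mu, logvar):
    """Sample z ~ N(mu, diag(exp(logvar))) so that gradients can flow
    through the encoder outputs mu and logvar."""
    std = torch.exp(0.5 * logvar)
    eps = torch.randn_like(std)
    return mu + eps * std
```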

Planar flows

python main_experiment.py -d mnist -nf 4 --flow planar
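
A planar flow (Rezende & Mohamed, 2015) applies z' = z + u tanh(wᵀ z + b). The sketch below includes the usual correction of u that keeps the flow invertible; it is illustrative, not the repository's exact code:

```python
import torch
import torch.nn.functional as F

def planar_step(z, u, w, b):
    """One planar flow step: z' = z + u_hat * tanh(w^T z + b).

    z: (batch, d); u, w: (d,); b: scalar.
    """
    # constrain w^T u_hat >= -1 so the transformation stays invertible
    wtu = w @ u
    u_hat = u + (F.softplus(wtu) - 1 - wtu) * w / (w @ w)
    pre = z @ w + b                                    # (batch,)
    z_new = z + u_hat * torch.tanh(pre).unsqueeze(1)
    psi = (1 - torch.tanh(pre) ** 2).unsqueeze(1) * w  # tanh'(pre) * w
    log_det = torch.log(torch.abs(1 + psi @ u_hat))
    return z_new, log_det
```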

Inverse Autoregressive flows
This example uses MADEs with 320 hidden units.

python main_experiment.py -d mnist -nf 4 --flow iaf --made_h_size 320
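
In an IAF step each output dimension depends only on the preceding input dimensions, so the Jacobian is triangular and its log-determinant is cheap. In the sketch below, made is assumed to be a masked autoregressive network (MADE) returning per-dimension (m, s); the gated form follows Kingma et al. (2016):

```python
import torch

def iaf_step(z, made):
    """One inverse autoregressive flow step:
    z' = sigma * z + (1 - sigma) * m, with sigma = sigmoid(s).

    The autoregressive structure of `made` makes the Jacobian
    triangular, so log|det J| = sum(log sigma).
    """
    m, s = made(z)                 # each (batch, d), autoregressive in z
    sigma = torch.sigmoid(s)
    z_new = sigma * z + (1 - sigma) * m
    log_det = torch.log(sigma).sum(dim=1)
    return z_new, log_det
```

The --made_h_size 320 flag sets the hidden-layer width of each MADE.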

More information about additional argument options can be found by running `python main_experiment.py -h`.

Cite

Please cite our paper if you use this code in your own work:

@inproceedings{vdberg2018sylvester,
  title={Sylvester normalizing flows for variational inference},
  author={van den Berg, Rianne and Hasenclever, Leonard and Tomczak, Jakub and Welling, Max},
  booktitle={Proceedings of the Conference on Uncertainty in Artificial Intelligence (UAI)},
  year={2018}
}