Hyperspherical Variational Auto-Encoders

A TensorFlow implementation of Hyperspherical Variational Auto-Encoders.

Overview

This library contains a TensorFlow implementation of the hyperspherical variational auto-encoder, or S-VAE, as presented in [1](http://arxiv.org/abs/1804.00891). Also check out our blog post: https://nicola-decao.github.io/s-vae.

  • Don't use TensorFlow? Take a look at the PyTorch implementation, s-vae-pytorch!

Dependencies

Installation

To install, run

$ python setup.py install

Structure

  • distributions: TensorFlow implementation of the von Mises-Fisher and hyperspherical uniform distributions. Both inherit from tf.distributions.Distribution.
  • ops: Low-level operations for computing the exponentially scaled modified Bessel function of the first kind and its derivative.
  • examples: Example code for using the library within a TensorFlow project.
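For orientation, the hyperspherical uniform's log-density is simply a constant: minus the log surface area of the unit sphere. A small standalone sketch of that constant (pure Python; the function names are illustrative, not the library's API):

```python
import math

def log_surface_area(m):
    """Log surface area of the unit sphere S^{m-1} embedded in R^m:
    log( 2 * pi^(m/2) / Gamma(m/2) )."""
    return math.log(2.0) + (m / 2.0) * math.log(math.pi) - math.lgamma(m / 2.0)

def hyperspherical_uniform_log_prob(m):
    """Constant log-density of the uniform distribution on S^{m-1}."""
    return -log_surface_area(m)
```

As a sanity check, m = 2 recovers the circumference 2π of the circle and m = 3 recovers the area 4π of the ordinary sphere.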

Usage

Please have a look at the examples folder. We adapted our implementation to follow the structure of the recently proposed TensorFlow Distributions library (Dillon et al., 2017).
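As a reference point for what the examples compute, the KL divergence between a vMF posterior and the hyperspherical uniform prior has a closed form (see [1]). A hedged NumPy/SciPy sketch, using the exponentially scaled Bessel function `scipy.special.ive` for numerical stability (function name illustrative, not the library's API):

```python
import numpy as np
from scipy.special import gammaln, ive

def vmf_uniform_kl(m, kappa):
    """KL( vMF(mu, kappa) || Uniform(S^{m-1}) ); independent of mu.

    ive(v, k) = iv(v, k) * exp(-k), so large kappa does not overflow.
    """
    # log normalizing constant of the vMF density
    log_c = ((m / 2.0 - 1.0) * np.log(kappa)
             - (m / 2.0) * np.log(2.0 * np.pi)
             - (np.log(ive(m / 2.0 - 1.0, kappa)) + kappa))
    # E_q[mu^T x] = I_{m/2}(kappa) / I_{m/2-1}(kappa);
    # the exp(-kappa) scaling factors cancel in the ratio
    mean_resultant = ive(m / 2.0, kappa) / ive(m / 2.0 - 1.0, kappa)
    # log surface area of S^{m-1}, i.e. minus the uniform log-density
    log_area = np.log(2.0) + (m / 2.0) * np.log(np.pi) - gammaln(m / 2.0)
    return kappa * mean_resultant + log_c + log_area
```

As expected, the KL vanishes as kappa → 0 (the vMF collapses to the uniform) and grows monotonically with concentration.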

Please cite [1] in your work when using this library in your experiments.

Sampling von Mises-Fisher

To sample from the von Mises-Fisher distribution we follow the rejection sampling procedure outlined by Ulrich (1984). The simulation pipeline is visualised below:

[Figure: visualisation of the von Mises-Fisher rejection-sampling pipeline]

Note that since the quantity accepted in the rejection step is a scalar, this approach does not suffer from the curse of dimensionality. For the final transformation onto the hypersphere, a Householder reflection is utilized.
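The pipeline above can be sketched in self-contained NumPy (illustrative only, not the library's TensorFlow implementation; the acceptance test follows the form given by Wood, 1994): rejection-sample the scalar component along the mean direction, draw a uniform tangent direction, then reflect onto the desired mean with a Householder transform.

```python
import numpy as np

def sample_vmf(mu, kappa, n, rng):
    """Draw n samples from vMF(mu, kappa) on S^{m-1} (Ulrich, 1984)."""
    m = mu.shape[0]
    # --- rejection-sample the scalar w = <sample, mu> ---
    b = (-2.0 * kappa + np.sqrt(4.0 * kappa**2 + (m - 1) ** 2)) / (m - 1)
    x0 = (1.0 - b) / (1.0 + b)
    c = kappa * x0 + (m - 1) * np.log(1.0 - x0**2)
    ws = []
    while len(ws) < n:
        z = rng.beta(0.5 * (m - 1), 0.5 * (m - 1))
        w = (1.0 - (1.0 + b) * z) / (1.0 - (1.0 - b) * z)
        if kappa * w + (m - 1) * np.log(1.0 - x0 * w) - c >= np.log(rng.uniform()):
            ws.append(w)
    w = np.array(ws)[:, None]
    # --- uniform direction on S^{m-2} for the tangent component ---
    v = rng.normal(size=(n, m - 1))
    v /= np.linalg.norm(v, axis=1, keepdims=True)
    z = np.concatenate([w, np.sqrt(1.0 - w**2) * v], axis=1)  # centred at e1
    # --- Householder reflection mapping e1 onto mu (assumes mu != e1) ---
    e1 = np.zeros(m)
    e1[0] = 1.0
    u = e1 - mu
    u /= np.linalg.norm(u)
    return z - 2.0 * (z @ u)[:, None] * u[None, :]
```

Only the scalar w is rejection-sampled, whatever the dimension m; the remaining components are drawn exactly, which is why the method scales to high-dimensional latent spaces.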

Feedback

For questions and comments, feel free to contact Nicola De Cao.

License

MIT

Citation

[1] Davidson, T. R., Falorsi, L., De Cao, N., Kipf, T.,
and Tomczak, J. M. (2018). Hyperspherical Variational
Auto-Encoders. 34th Conference on Uncertainty in Artificial Intelligence (UAI-18).

BibTeX format:

@article{s-vae18,
  title={Hyperspherical Variational Auto-Encoders},
  author={Davidson, Tim R. and
          Falorsi, Luca and
          De Cao, Nicola and
          Kipf, Thomas and
          Tomczak, Jakub M.},
  journal={34th Conference on Uncertainty in Artificial Intelligence (UAI-18)},
  year={2018}
}