
Hyperspherical Variational Auto-Encoders

PyTorch implementation of Hyperspherical Variational Auto-Encoders

Overview

This library contains a PyTorch implementation of the hyperspherical variational auto-encoder, or S-VAE, as presented in [1](http://arxiv.org/abs/1804.00891). Also check out our blog post (https://nicola-decao.github.io/s-vae).

  • Don't use PyTorch? Take a look here for a TensorFlow implementation!

Dependencies

Installation

To install, run

$ python setup.py install

Structure

  • distributions: PyTorch implementations of the von Mises-Fisher and hyperspherical uniform distributions. Both inherit from torch.distributions.Distribution.
  • ops: Low-level operations used for computing the exponentially scaled modified Bessel function of the first kind and its derivative.
  • examples: Example code for using the library within a PyTorch project.
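The exponentially scaled Bessel function ive(ν, κ) = I_ν(κ)·e^(−κ) stays finite for large concentrations, which is why it is used instead of I_ν(κ) directly. The library implements this function and its derivative as PyTorch ops; as a quick illustration of the underlying identity, here is a NumPy/SciPy sketch that checks the derivative against a finite difference (the helper name `ive_deriv` is ours, not the library's):

```python
import numpy as np
from scipy.special import ive  # ive(v, x) = I_v(x) * exp(-x), exponentially scaled


def ive_deriv(v, x):
    """Derivative of ive(v, x) in x, via the recurrence I'_v = I_{v+1} + (v/x) I_v.

    d/dx [I_v(x) e^{-x}] = (I_{v+1}(x) + (v/x) I_v(x) - I_v(x)) e^{-x}
                         = ive(v + 1, x) + (v / x - 1) * ive(v, x)
    """
    return ive(v + 1, x) + (v / x - 1.0) * ive(v, x)


# Sanity check against a central finite difference.
v, x, h = 2.0, 5.0, 1e-6
numeric = (ive(v, x + h) - ive(v, x - h)) / (2.0 * h)
analytic = ive_deriv(v, x)
```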

Usage

Please have a look at the examples folder. We adapted our implementation to follow the structure of PyTorch's probability distributions.
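Both distributions expose the standard torch.distributions interface (sampling, log-probabilities, entropy). As a framework-free illustration of the simpler of the two, the hyperspherical uniform on S^(m−1) can be sampled by normalizing standard Gaussian draws, and its entropy is the log surface area of the sphere. This NumPy sketch uses illustrative names, not the library's actual API:

```python
import numpy as np
from math import lgamma, log, pi


def sample_hyperspherical_uniform(m, n, rng=None):
    """Draw n uniform samples on the unit sphere S^(m-1) in R^m."""
    rng = np.random.default_rng() if rng is None else rng
    x = rng.standard_normal((n, m))  # isotropic Gaussian draws ...
    return x / np.linalg.norm(x, axis=1, keepdims=True)  # ... normalized is uniform


def hyperspherical_uniform_entropy(m):
    """Entropy = log surface area of S^(m-1): log 2 + (m/2) log pi - log Gamma(m/2)."""
    return log(2.0) + 0.5 * m * log(pi) - lgamma(0.5 * m)


z = sample_hyperspherical_uniform(3, 1000, np.random.default_rng(0))
```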

Please cite [1] in your work when using this library in your experiments.

Sampling von Mises-Fisher

To sample from the von Mises-Fisher distribution we follow the rejection sampling procedure outlined by Ulrich (1984). This simulation pipeline is visualized below:

[Figure: the von Mises-Fisher rejection-sampling pipeline]

Note that as ω is a scalar, this approach does not suffer from the curse of dimensionality. For the final transformation, U(z; μ), a Householder reflection is utilized.
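The procedure above can be sketched framework-free in NumPy (the library's actual sampler is a differentiable PyTorch implementation; `sample_vmf` and its signature are illustrative), following Ulrich (1984) in the numerically stable form of Wood (1994):

```python
import numpy as np


def sample_vmf(mu, kappa, n, rng=None):
    """Rejection-sample n points from vMF(mu, kappa) on S^(m-1), m = len(mu)."""
    rng = np.random.default_rng() if rng is None else rng
    mu = np.asarray(mu, dtype=float)
    m = mu.size
    # Envelope constants (Ulrich 1984; numerically stable form from Wood 1994).
    b = (-2.0 * kappa + np.sqrt(4.0 * kappa**2 + (m - 1) ** 2)) / (m - 1)
    x0 = (1.0 - b) / (1.0 + b)
    c = kappa * x0 + (m - 1) * np.log(1.0 - x0**2)

    out = np.empty((n, m))
    for i in range(n):
        # Rejection step samples only the scalar w = <z, mu>.
        while True:
            zb = rng.beta(0.5 * (m - 1), 0.5 * (m - 1))
            w = (1.0 - (1.0 + b) * zb) / (1.0 - (1.0 - b) * zb)
            if kappa * w + (m - 1) * np.log(1.0 - x0 * w) - c >= np.log(rng.uniform()):
                break
        # Tangent direction: uniform on S^(m-2).
        v = rng.standard_normal(m - 1)
        v /= np.linalg.norm(v)
        z = np.concatenate(([w], np.sqrt(max(1.0 - w * w, 0.0)) * v))  # mean dir e1
        # Householder reflection mapping e1 onto mu.
        u = np.zeros(m)
        u[0] = 1.0
        u -= mu
        norm_u = np.linalg.norm(u)
        if norm_u > 1e-12:  # if mu == e1 the reflection is the identity
            u /= norm_u
            z = z - 2.0 * u * (u @ z)
        out[i] = z
    return out


samples = sample_vmf([0.0, 0.0, 1.0], kappa=50.0, n=500, rng=np.random.default_rng(0))
```

Note how the rejection loop only ever works with the scalar w, regardless of the dimension m; the direction orthogonal to μ is sampled exactly, which is what keeps the method viable in high dimensions.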

Feedback

For questions and comments, feel free to contact Nicola De Cao or Tim Davidson.

License

MIT

Citation

[1] Davidson, T. R., Falorsi, L., De Cao, N., Kipf, T., and Tomczak, J. M. (2018). Hyperspherical Variational Auto-Encoders. 34th Conference on Uncertainty in Artificial Intelligence (UAI-18).

BibTeX format:

@article{s-vae18,
  title={Hyperspherical Variational Auto-Encoders},
  author={Davidson, Tim R. and
          Falorsi, Luca and
          De Cao, Nicola and
          Kipf, Thomas and
          Tomczak, Jakub M.},
  journal={34th Conference on Uncertainty in Artificial Intelligence (UAI-18)},
  year={2018}
}