wuga214 / Implementation_variational Auto Encoder

License: MIT
Simple implementation of Variational Autoencoder



Variational Autoencoder

This is an enhanced implementation of a Variational Autoencoder. Both fully connected and convolutional encoder/decoder architectures are included in this model. Please star the repository if you find this implementation useful.
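The core mechanics of a VAE, regardless of whether the encoder/decoder is fully connected or convolutional, are the reparameterization trick and the closed-form KL term against a standard normal prior. The following is a minimal NumPy sketch of those two pieces only, not the repository's actual code:

```python
import numpy as np

def reparameterize(mu, logvar, rng):
    # z = mu + sigma * eps, with eps ~ N(0, I); sampling stays a
    # deterministic function of (mu, logvar), so gradients can flow
    # through it in an actual training framework.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def kl_to_standard_normal(mu, logvar):
    # Closed-form KL(q(z|x) || N(0, I)) for a diagonal Gaussian,
    # summed over latent dimensions (one value per batch row).
    return -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar), axis=-1)

rng = np.random.default_rng(0)
mu = np.zeros((1, 2))
logvar = np.zeros((1, 2))
z = reparameterize(mu, logvar, rng)      # one latent sample per row
kl = kl_to_standard_normal(mu, logvar)   # 0 when q(z|x) = N(0, I)
```

The training loss is then the reconstruction term plus this KL term, averaged over the batch.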

Use

$ python vae_train_amine.py   # training
$ python sample.py            # sampling

Update

  1. Removed standard deviation learning on the Gaussian observation decoder.
  2. Made the standard deviation of the observation a hyperparameter.
  3. Added deconvolutional CNN support for the Anime dataset.
  4. Removed the Anime dataset itself to avoid legal issues.
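Updates 1 and 2 replace a learned observation standard deviation with a fixed hyperparameter. Under a Gaussian observation decoder with fixed sigma, the reconstruction term reduces to a scaled squared error plus a constant, so sigma effectively weights reconstruction against the KL term. A small illustrative sketch (hypothetical function name, not from the repository):

```python
import numpy as np

def gaussian_recon_nll(x, x_hat, sigma=0.1):
    # Negative log-likelihood of x under N(x_hat, sigma^2 I) with fixed sigma.
    # With sigma fixed, the constant term does not affect optimization, and the
    # remaining term is squared error scaled by 1 / (2 * sigma^2): a smaller
    # sigma weights reconstruction more heavily relative to the KL term.
    d = x.size
    const = 0.5 * d * np.log(2.0 * np.pi * sigma**2)
    return const + np.sum((x - x_hat) ** 2) / (2.0 * sigma**2)
```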

Pre-Trained Models

There are two pretrained models:

  1. Anime
  2. MNIST

The weights of the pretrained models are located in the weights folder.

Samples

ANIME

MNIST

Latent Space Distribution
