
kefirski / contiguous-succotash

License: MIT
A Recurrent Variational Autoencoder with Dilated Convolutions that generates sequential data, implemented in PyTorch.

Programming Languages

Python

Projects that are alternatives of or similar to contiguous-succotash

Keras-Generating-Sentences-from-a-Continuous-Space
Text Variational Autoencoder inspired by the paper 'Generating Sentences from a Continuous Space' by Bowman et al. https://arxiv.org/abs/1511.06349
Stars: ✭ 32 (-54.93%)
Mutual labels:  vae
Carla-ppo
This repository hosts a customized PPO-based agent for Carla. The goal of this project is to make it easier to interact with and experiment in Carla with reinforcement learning based agents, by wrapping Carla in a gym-like environment that can handle custom reward functions, custom debug output, etc.
Stars: ✭ 122 (+71.83%)
Mutual labels:  vae
Parallel-Tacotron2
PyTorch Implementation of Google's Parallel Tacotron 2: A Non-Autoregressive Neural TTS Model with Differentiable Duration Modeling
Stars: ✭ 149 (+109.86%)
Mutual labels:  vae
Generative-Model
Repository for implementation of generative models with Tensorflow 1.x
Stars: ✭ 66 (-7.04%)
Mutual labels:  vae
VAENAR-TTS
PyTorch Implementation of VAENAR-TTS: Variational Auto-Encoder based Non-AutoRegressive Text-to-Speech Synthesis.
Stars: ✭ 66 (-7.04%)
Mutual labels:  vae
style-vae
Implementation of VAE and Style-GAN Architecture Achieving State of the Art Reconstruction
Stars: ✭ 25 (-64.79%)
Mutual labels:  vae
vae captioning
Implementation of Diverse and Accurate Image Description Using a Variational Auto-Encoder with an Additive Gaussian Encoding Space
Stars: ✭ 58 (-18.31%)
Mutual labels:  vae
VAE-Gumbel-Softmax
An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick in TensorFlow (tested on r1.5 CPU and GPU), from ICLR 2017.
Stars: ✭ 66 (-7.04%)
Mutual labels:  vae
Pytorch-RL-CPP
A Repository with C++ implementations of Reinforcement Learning Algorithms (Pytorch)
Stars: ✭ 73 (+2.82%)
Mutual labels:  vae
char-VAE
Inspired by the neural style algorithm in the computer vision field, we propose a high-level language model with the aim of adapting the linguistic style.
Stars: ✭ 18 (-74.65%)
Mutual labels:  vae
nvae
An unofficial toy implementation of NVAE, "A Deep Hierarchical Variational Autoencoder".
Stars: ✭ 83 (+16.9%)
Mutual labels:  vae
vqvae-2
PyTorch implementation of VQ-VAE-2 from "Generating Diverse High-Fidelity Images with VQ-VAE-2"
Stars: ✭ 65 (-8.45%)
Mutual labels:  vae
continuous Bernoulli
C-language programs for the simulator, transformation, and test statistic of the continuous Bernoulli distribution; the accompanying book also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-69.01%)
Mutual labels:  vae
probabilistic nlg
Tensorflow Implementation of Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation (NAACL 2019).
Stars: ✭ 28 (-60.56%)
Mutual labels:  vae
variational-autoencoder-theano
Variational Autoencoders (VAEs) in Theano for Images and Text
Stars: ✭ 54 (-23.94%)
Mutual labels:  vae
pyroVED
Invariant representation learning from imaging and spectral data
Stars: ✭ 23 (-67.61%)
Mutual labels:  vae
BERT4Rec-VAE-Pytorch
Pytorch implementation of BERT4Rec and Netflix VAE.
Stars: ✭ 212 (+198.59%)
Mutual labels:  vae
sqair
Implementation of Sequential Attend, Infer, Repeat (SQAIR)
Stars: ✭ 96 (+35.21%)
Mutual labels:  vae
generative deep learning
Generative Deep Learning Sessions led by Anugraha Sinha (Machine Learning Tokyo)
Stars: ✭ 24 (-66.2%)
Mutual labels:  vae
TensorMONK
A collection of deep learning models (PyTorch implementation)
Stars: ✭ 21 (-70.42%)
Mutual labels:  vae

PyTorch Recurrent Variational Autoencoder with Dilated Convolutions

Model:

This is an implementation of Zichao Yang's Improved Variational Autoencoders for Text Modeling using Dilated Convolutions, using the token embedding from Kim's Character-Aware Neural Language Models.

(model architecture diagram)

Most of the recurrent variational autoencoder implementation is adapted from analvikingur/pytorch_RVAE.
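
For intuition, here is a minimal sketch of the kind of dilated causal convolution stack such a decoder uses. The layer sizes, dilation rates, and module names are illustrative assumptions, not the repository's exact configuration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedConvDecoderSketch(nn.Module):
    """Sketch of a dilated causal convolution decoder (illustrative sizes)."""

    def __init__(self, embed_size=300, latent_size=100, vocab_size=10000,
                 dilations=(1, 2, 4)):
        super().__init__()
        self.latent_to_embed = nn.Linear(latent_size, embed_size)
        self.convs = nn.ModuleList([
            nn.Conv1d(embed_size, embed_size, kernel_size=3, dilation=d)
            for d in dilations
        ])
        self.dilations = dilations
        self.out = nn.Linear(embed_size, vocab_size)

    def forward(self, decoder_input, z):
        # decoder_input: [batch, seq_len, embed_size], z: [batch, latent_size]
        x = decoder_input + self.latent_to_embed(z).unsqueeze(1)
        x = x.transpose(1, 2)                          # [batch, embed_size, seq_len]
        for conv, d in zip(self.convs, self.dilations):
            # Left-pad so each convolution stays causal (no access to future tokens).
            pad = (conv.kernel_size[0] - 1) * d
            x = F.relu(conv(F.pad(x, (pad, 0))))
        x = x.transpose(1, 2)                          # [batch, seq_len, embed_size]
        return self.out(x)                             # per-step vocabulary logits

Stacking convolutions with increasing dilation widens the receptive field exponentially while keeping the decoder non-recurrent, which is the core idea of Yang's dilated-convolution decoder.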

Usage

Before training the model, it is necessary to train word embeddings:

$ python train_word_embeddings.py

This script trains word embeddings as described in Mikolov et al., Distributed Representations of Words and Phrases and their Compositionality.

Parameters:

--use-cuda

--num-iterations

--batch-size

--num-sample –– number of tokens sampled from the noise distribution (see the sketch after this list)
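
For reference, a minimal sketch of the skip-gram negative-sampling objective from Mikolov et al. that a script like this optimizes; the class name, embedding sizes, and tensor shapes below are assumptions for illustration, not the script's actual code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class NegativeSamplingSketch(nn.Module):
    """Skip-gram with negative sampling; sizes are illustrative only."""

    def __init__(self, vocab_size, embed_size=300):
        super().__init__()
        self.in_embed = nn.Embedding(vocab_size, embed_size)   # center words
        self.out_embed = nn.Embedding(vocab_size, embed_size)  # context words

    def forward(self, center, context, noise):
        # center: [batch], context: [batch], noise: [batch, num_sample]
        v = self.in_embed(center)                               # [batch, embed]
        u_pos = self.out_embed(context)                         # [batch, embed]
        u_neg = self.out_embed(noise)                           # [batch, num_sample, embed]

        pos_score = (v * u_pos).sum(dim=1)                      # [batch]
        neg_score = torch.bmm(u_neg, v.unsqueeze(2)).squeeze(2) # [batch, num_sample]

        # Pull true contexts closer to the center word, push noise tokens away.
        loss = -(F.logsigmoid(pos_score) + F.logsigmoid(-neg_score).sum(dim=1))
        return loss.mean()

Here the --num-sample flag corresponds to how many noise tokens are drawn per training pair.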

To train the model, use:

$ python train.py

Parameters:

--use-cuda

--num-iterations

--batch-size

--learning-rate

--dropout –– probability of a unit in the decoder input being zeroed

--use-trained –– resume from a previously trained model
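
The training objective is the usual negative ELBO: token-level reconstruction cross-entropy plus a KL term between the encoder posterior and the standard normal prior (the KL weight is typically annealed during training). A minimal sketch of that loss is below; the function name, tensor shapes, and pad_index default are illustrative assumptions, not the repository's exact code.

import torch
import torch.nn.functional as F

def elbo_loss(logits, targets, mu, logvar, kl_weight=1.0, pad_index=0):
    """Sketch of the RVAE training objective: reconstruction + weighted KL.

    logits:     [batch, seq_len, vocab]  decoder outputs
    targets:    [batch, seq_len]         target token ids
    mu, logvar: [batch, latent]          encoder posterior parameters
    """
    # Token-level cross-entropy, ignoring padding positions.
    reconstruction = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        targets.reshape(-1),
        ignore_index=pad_index,
    )
    # KL divergence between N(mu, sigma^2) and the standard normal prior.
    kld = -0.5 * torch.mean(torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1))
    return reconstruction + kl_weight * kld

The --dropout option zeroes part of the decoder input, which encourages the decoder to rely on the latent code rather than on teacher forcing alone.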

To sample data after training, use:

$ python sample.py

Parameters:

--use-cuda

--num-sample
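
Sampling draws a latent code z from the standard normal prior and decodes it token by token. A minimal greedy-decoding sketch under assumed interfaces is below; decoder, embed, go_index, and end_index are hypothetical stand-ins for the trained modules and special token ids.

import torch

@torch.no_grad()
def sample_from_prior(decoder, embed, go_index, end_index, latent_size=100,
                      max_len=50, device="cpu"):
    """Sketch of sampling: draw z ~ N(0, I) and decode greedily."""
    z = torch.randn(1, latent_size, device=device)           # latent code from the prior
    tokens = [go_index]
    for _ in range(max_len):
        inp = embed(torch.tensor([tokens], device=device))   # [1, t, embed]
        logits = decoder(inp, z)                              # [1, t, vocab]
        next_token = logits[0, -1].argmax().item()            # greedy next-token choice
        if next_token == end_index:
            break
        tokens.append(next_token)
    return tokens[1:]

In practice the --num-sample flag controls how many such sequences are generated.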
