
kefirski / Pytorch_rvae

License: MIT
Recurrent Variational Autoencoder that generates sequential data, implemented with PyTorch

Programming Languages

Python

Projects that are alternatives to or similar to Pytorch_rvae

continuous Bernoulli
C-language programs for the simulator, transformation, and test statistic of the continuous Bernoulli distribution. It also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-93.37%)
Mutual labels:  vae
contiguous-succotash
Recurrent Variational Autoencoder with Dilated Convolutions that generates sequential data, implemented in PyTorch
Stars: ✭ 71 (-78.61%)
Mutual labels:  vae
Copycat-abstractive-opinion-summarizer
ACL 2020 Unsupervised Opinion Summarization as Copycat-Review Generation
Stars: ✭ 76 (-77.11%)
Mutual labels:  vae
char-VAE
Inspired by the neural style algorithm in the computer vision field, we propose a high-level language model with the aim of adapting the linguistic style.
Stars: ✭ 18 (-94.58%)
Mutual labels:  vae
VAE-Gumbel-Softmax
An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU).
Stars: ✭ 66 (-80.12%)
Mutual labels:  vae
DiffuseVAE
A combination of VAEs and Diffusion Models for efficient, controllable, and high-fidelity generation from low-dimensional latents
Stars: ✭ 81 (-75.6%)
Mutual labels:  vae
BERT4Rec-VAE-Pytorch
Pytorch implementation of BERT4Rec and Netflix VAE.
Stars: ✭ 212 (-36.14%)
Mutual labels:  vae
Daisyrec
A developing recommender system in PyTorch. Algorithms: KNN, LFM, SLIM, NeuMF, FM, DeepFM, VAE, and so on, aiming at fair comparison across recommender-system benchmarks
Stars: ✭ 280 (-15.66%)
Mutual labels:  vae
sqair
Implementation of Sequential Attend, Infer, Repeat (SQAIR)
Stars: ✭ 96 (-71.08%)
Mutual labels:  vae
srVAE
VAE with RealNVP prior and Super-Resolution VAE in PyTorch. Code release for https://arxiv.org/abs/2006.05218.
Stars: ✭ 56 (-83.13%)
Mutual labels:  vae
Parallel-Tacotron2
PyTorch Implementation of Google's Parallel Tacotron 2: A Non-Autoregressive Neural TTS Model with Differentiable Duration Modeling
Stars: ✭ 149 (-55.12%)
Mutual labels:  vae
generative deep learning
Generative Deep Learning Sessions led by Anugraha Sinha (Machine Learning Tokyo)
Stars: ✭ 24 (-92.77%)
Mutual labels:  vae
ladder-vae-pytorch
Ladder Variational Autoencoders (LVAE) in PyTorch
Stars: ✭ 59 (-82.23%)
Mutual labels:  vae
TensorMONK
A collection of deep learning models (PyTorch implementation)
Stars: ✭ 21 (-93.67%)
Mutual labels:  vae
classifying-vae-lstm
Music generation with a classifying variational autoencoder (VAE) and LSTM
Stars: ✭ 27 (-91.87%)
Mutual labels:  vae
style-vae
Implementation of VAE and Style-GAN Architecture Achieving State of the Art Reconstruction
Stars: ✭ 25 (-92.47%)
Mutual labels:  vae
learning-to-drive-in-5-minutes
Implementation of a reinforcement learning approach to make a car learn to drive smoothly in minutes
Stars: ✭ 227 (-31.63%)
Mutual labels:  vae
Beta Vae
Pytorch implementation of β-VAE
Stars: ✭ 326 (-1.81%)
Mutual labels:  vae
S Vae Pytorch
Pytorch implementation of Hyperspherical Variational Auto-Encoders
Stars: ✭ 255 (-23.19%)
Mutual labels:  vae
disent
🧶 Modular VAE disentanglement framework for python built with PyTorch Lightning ▸ Including metrics and datasets ▸ With strongly supervised, weakly supervised and unsupervised methods ▸ Easily configured and run with Hydra config ▸ Inspired by disentanglement_lib
Stars: ✭ 41 (-87.65%)
Mutual labels:  vae

Pytorch Recurrent Variational Autoencoder

Model:

This is an implementation of Samuel Bowman's Generating Sentences from a Continuous Space, with token embeddings from Kim's Character-Aware Neural Language Models.
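The data flow is: an encoder RNN summarizes the input sentence into a hidden state, that state parameterizes a Gaussian latent variable z sampled with the reparameterization trick, and a decoder RNN seeded with z reconstructs the sentence. Below is a minimal sketch of that flow; the class, layer, and parameter names are illustrative and are not the ones used in this repository.

import torch
import torch.nn as nn

class MiniRVAE(nn.Module):
    """Minimal recurrent VAE: encoder RNN -> Gaussian latent z -> decoder RNN."""

    def __init__(self, vocab_size, embed_size=300, hidden_size=512, latent_size=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.encoder = nn.GRU(embed_size, hidden_size, batch_first=True)
        self.to_mu = nn.Linear(hidden_size, latent_size)
        self.to_logvar = nn.Linear(hidden_size, latent_size)
        self.latent_to_hidden = nn.Linear(latent_size, hidden_size)
        self.decoder = nn.GRU(embed_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens):
        emb = self.embed(tokens)                    # [batch, seq, embed]
        _, h = self.encoder(emb)                    # h: [1, batch, hidden]
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization
        h0 = self.latent_to_hidden(z).unsqueeze(0)  # seed decoder state with z
        dec, _ = self.decoder(emb, h0)              # teacher forcing on the inputs
        logits = self.out(dec)                      # per-step vocabulary logits
        kld = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(1).mean()
        return logits, kld

The training objective combines the per-token reconstruction cross-entropy with the KL term returned above, as in the standard VAE lower bound.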

Sampling examples:

the new machine could be used to increase the number of ventures block in the company 's <unk> shopping system to finance diversified organizations

u.s. government officials also said they would be willing to consider whether the proposal could be used as urging and programs

men believe they had to go on the <unk> because their <unk> were <unk> expensive important

the companies insisted that the color set could be included in the program

Usage

Before training the model, it is necessary to train word embeddings:

$ python train_word_embeddings.py

This script trains word embeddings as described in Mikolov et al., Distributed Representations of Words and Phrases and their Compositionality; a sketch of the negative-sampling objective it optimizes follows the parameter list below.

Parameters:

--use-cuda

--num-iterations

--batch-size

--num-sample –– number of tokens sampled from the noise distribution
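
For intuition, the objective looks roughly like the following simplified sketch of skip-gram with negative sampling; the function and argument names are illustrative and not taken from this repository.

import torch
import torch.nn.functional as F

def negative_sampling_loss(in_embed, out_embed, center, context, noise):
    """Skip-gram loss with negative sampling (Mikolov et al., 2013), simplified.

    center, context: [batch] token ids; noise: [batch, k] token ids drawn
    from the noise distribution (k is what --num-sample controls).
    in_embed / out_embed are nn.Embedding tables for input/output vectors.
    """
    v = in_embed(center)                                   # [batch, dim]
    pos = F.logsigmoid((v * out_embed(context)).sum(-1))   # score of the true pair
    neg_scores = torch.bmm(out_embed(noise), v.unsqueeze(2)).squeeze(2)  # [batch, k]
    neg = F.logsigmoid(-neg_scores).sum(1)                 # push noise tokens away
    return -(pos + neg).mean()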

To train the model, use the following; a schematic training step is sketched after the parameter list:

$ python train.py

Parameters:

--use-cuda

--num-iterations

--batch-size

--learning-rate

--dropout –– probability that units in the decoder input are zeroed

--use-trained –– resume from a previously trained model
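
Following Bowman et al., training balances the reconstruction loss against the KL term, typically annealing the KL weight from 0 toward 1 to avoid posterior collapse. A schematic training step, reusing the illustrative MiniRVAE sketch above (the anneal schedule here is an assumption, not this repository's exact recipe):

import torch.nn.functional as F

def train_step(model, optimizer, tokens, iteration, anneal_steps=10000):
    """One schematic step: reconstruction loss + annealed KL term."""
    logits, kld = model(tokens)
    # Reconstruction: cross-entropy between predicted and observed tokens
    # (a real implementation would shift the targets by one position).
    rec = F.cross_entropy(logits.view(-1, logits.size(-1)), tokens.view(-1))
    kl_weight = min(1.0, iteration / anneal_steps)  # anneal KL weight from 0 to 1
    loss = rec + kl_weight * kld
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()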

To sample data after training, use the following; a conceptual decoding sketch appears after the parameter list:

$ python sample.py

Parameters:

--use-cuda

--num-sample
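
Conceptually, sampling draws z from the standard normal prior and decodes one token at a time. The greedy-decoding sketch below reuses the illustrative MiniRVAE defined earlier; bos_id and eos_id are assumed special-token ids, not this repository's actual interface.

import torch

@torch.no_grad()
def sample_sentence(model, max_len=30, bos_id=1, eos_id=2, latent_size=64):
    """Draw z ~ N(0, I) from the prior and decode greedily (sketch)."""
    z = torch.randn(1, latent_size)
    h = model.latent_to_hidden(z).unsqueeze(0)   # initial decoder state from z
    token = torch.tensor([[bos_id]])
    ids = []
    for _ in range(max_len):
        out, h = model.decoder(model.embed(token), h)
        token = model.out(out).argmax(-1)        # greedy choice of next token
        if token.item() == eos_id:
            break
        ids.append(token.item())
    return ids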
