alexeyev / Keras-Generating-Sentences-from-a-Continuous-Space

License: MIT
Text Variational Autoencoder inspired by the paper 'Generating Sentences from a Continuous Space' by Bowman et al., https://arxiv.org/abs/1511.06349

Programming Languages

python

Projects that are alternatives to or similar to Keras-Generating-Sentences-from-a-Continuous-Space

Bagel
IPCCC 2018: Robust and Unsupervised KPI Anomaly Detection Based on Conditional Variational Autoencoder
Stars: ✭ 45 (+40.63%)
Mutual labels:  vae, variational-autoencoder
Textbox
TextBox is an open-source library for building text generation systems.
Stars: ✭ 257 (+703.13%)
Mutual labels:  text-generation, variational-autoencoder
Vae Cvae Mnist
Variational Autoencoder and Conditional Variational Autoencoder on MNIST in PyTorch
Stars: ✭ 229 (+615.63%)
Mutual labels:  vae, variational-autoencoder
S Vae Tf
Tensorflow implementation of Hyperspherical Variational Auto-Encoders
Stars: ✭ 198 (+518.75%)
Mutual labels:  vae, variational-autoencoder
benchmark VAE
Unifying Variational Autoencoder (VAE) implementations in Pytorch (NeurIPS 2022)
Stars: ✭ 1,211 (+3684.38%)
Mutual labels:  vae, variational-autoencoder
Cada Vae Pytorch
Official implementation of the paper "Generalized Zero- and Few-Shot Learning via Aligned Variational Autoencoders" (CVPR 2019)
Stars: ✭ 198 (+518.75%)
Mutual labels:  vae, variational-autoencoder
precision-recall-distributions
Assessing Generative Models via Precision and Recall (official repository)
Stars: ✭ 80 (+150%)
Mutual labels:  vae, variational-autoencoder
Vae Tensorflow
A Tensorflow implementation of a Variational Autoencoder for the deep learning course at the University of Southern California (USC).
Stars: ✭ 117 (+265.63%)
Mutual labels:  vae, variational-autoencoder
soft-intro-vae-pytorch
[CVPR 2021 Oral] Official PyTorch implementation of Soft-IntroVAE from the paper "Soft-IntroVAE: Analyzing and Improving Introspective Variational Autoencoders"
Stars: ✭ 170 (+431.25%)
Mutual labels:  vae, variational-autoencoder
MIDI-VAE
No description or website provided.
Stars: ✭ 56 (+75%)
Mutual labels:  vae, variational-autoencoder
Pytorch Vae
A CNN Variational Autoencoder (CNN-VAE) implemented in PyTorch
Stars: ✭ 181 (+465.63%)
Mutual labels:  vae, variational-autoencoder
Variational-Autoencoder-pytorch
Implementation of a convolutional Variational-Autoencoder model in pytorch.
Stars: ✭ 65 (+103.13%)
Mutual labels:  vae, variational-autoencoder
Tensorflow Mnist Cvae
Tensorflow implementation of conditional variational auto-encoder for MNIST
Stars: ✭ 139 (+334.38%)
Mutual labels:  vae, variational-autoencoder
Variational Recurrent Autoencoder Tensorflow
A tensorflow implementation of "Generating Sentences from a Continuous Space"
Stars: ✭ 228 (+612.5%)
Mutual labels:  vae, variational-autoencoder
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I do no longer maintain this repository.
Stars: ✭ 134 (+318.75%)
Mutual labels:  vae, variational-autoencoder
Video prediction
Stochastic Adversarial Video Prediction
Stars: ✭ 247 (+671.88%)
Mutual labels:  vae, variational-autoencoder
Smrt
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (+218.75%)
Mutual labels:  vae, variational-autoencoder
Mojitalk
Code for "MojiTalk: Generating Emotional Responses at Scale" https://arxiv.org/abs/1711.04090
Stars: ✭ 107 (+234.38%)
Mutual labels:  vae, variational-autoencoder
Vae Lagging Encoder
PyTorch implementation of "Lagging Inference Networks and Posterior Collapse in Variational Autoencoders" (ICLR 2019)
Stars: ✭ 153 (+378.13%)
Mutual labels:  text-generation, vae
vae-concrete
Keras implementation of a Variational Auto Encoder with a Concrete Latent Distribution
Stars: ✭ 51 (+59.38%)
Mutual labels:  vae, variational-autoencoder

Generating Sentences from a Continuous Space

A Keras implementation of an LSTM variational autoencoder, based on the code in twairball's repo but rewritten from scratch. It does not follow the paper exactly, but the main ideas are implemented.
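
For orientation, here is a minimal sketch of the general architecture, not this repo's exact code: an LSTM encoder maps a token sequence to the mean and log-variance of a Gaussian, a reparameterized sample z is repeated across time steps, and an LSTM decoder reconstructs the sentence. All layer sizes below are illustrative.

    # Minimal LSTM-VAE sketch (illustrative sizes, not the repo's exact model)
    from keras.layers import (Input, LSTM, Dense, Embedding, Lambda,
                              RepeatVector, TimeDistributed)
    from keras.models import Model
    from keras import backend as K

    seq_len, vocab_size, emb_dim, latent_dim = 20, 5000, 64, 32

    x = Input(shape=(seq_len,))
    h = LSTM(128)(Embedding(vocab_size, emb_dim)(x))       # encoder
    z_mean, z_log_var = Dense(latent_dim)(h), Dense(latent_dim)(h)

    def sampling(args):
        mu, log_var = args
        eps = K.random_normal(shape=(K.shape(mu)[0], latent_dim))
        return mu + K.exp(0.5 * log_var) * eps             # reparameterization trick

    z = Lambda(sampling)([z_mean, z_log_var])
    d = LSTM(128, return_sequences=True)(RepeatVector(seq_len)(z))  # decoder
    y = TimeDistributed(Dense(vocab_size, activation='softmax'))(d)

    def vae_loss(y_true, y_pred):                          # reconstruction + KL
        rec = K.sum(K.sparse_categorical_crossentropy(y_true, y_pred), axis=-1)
        kl = -0.5 * K.sum(1 + z_log_var - K.square(z_mean) - K.exp(z_log_var),
                          axis=-1)
        return rec + kl

    vae = Model(x, y)
    vae.compile(optimizer='adam', loss=vae_loss)
    # fit with inputs of shape (batch, seq_len) and targets of shape (batch, seq_len, 1)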

Quick start

Update: this code was written a while ago, so the easiest way to run the script now is inside a dedicated environment. The commands below assume Anaconda is installed and that you are on Linux or WSL; macOS/Windows instructions should be similar.

# create and activate a dedicated environment
conda create -y --name continuous_space python=3.6 && conda activate continuous_space
# download the English-French sentence pairs and move them into data/
wget http://d2l-data.s3-accelerate.amazonaws.com/fra-eng.zip && \
        unzip fra-eng.zip && mv fra-eng/fra.txt data/ && rm -r fra-eng*
# install the pinned versions the code was written against
conda install -y tensorflow==1.13.1
conda install -y keras==2.2.4
conda install -y -c anaconda nltk==3.4.5
# fetch the tokenizer models used for preprocessing
python -m nltk.downloader punkt

(this may take a while!)
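
A quick way to check that the pinned versions from the commands above actually resolved:

    # optional sanity check of the installed versions
    import tensorflow, keras, nltk
    print(tensorflow.__version__)  # expected: 1.13.1
    print(keras.__version__)       # expected: 2.2.4
    print(nltk.__version__)        # expected: 3.4.5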

Then run, e.g.:

python train.py --input data/fra.txt --epochs 20
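
For reference, fra.txt is a tab-separated corpus with an English sentence and its French translation on each line. A rough sketch of reading its English side (not the script's actual loader) could look like this:

    # sketch: read the English side of the tab-separated fra.txt corpus
    from nltk.tokenize import word_tokenize  # needs the 'punkt' resource

    with open('data/fra.txt', encoding='utf-8') as f:
        english = [line.split('\t')[0] for line in f if line.strip()]

    tokens = [word_tokenize(s.lower()) for s in english[:1000]]
    print(tokens[0])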

References

Bowman S.R., Vilnis L., Vinyals O., Dai A.M., Jozefowicz R., Bengio S. Generating Sentences from a Continuous Space. CoNLL 2016. https://arxiv.org/abs/1511.06349

License

MIT

TODO

  • Dropout and other tricks from the paper
  • Initialization with word2vec/GloVe/other pretrained vectors using the Embedding layer and its weight matrix (see the sketch below)
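
A minimal sketch of that second item, with a toy vocabulary and a toy pretrained lookup standing in for real word2vec/GloVe data:

    # sketch: initialize an Embedding layer from pretrained word vectors
    import numpy as np
    from keras.layers import Embedding

    emb_dim = 100
    word_index = {'i': 1, 'feel': 2, 'cold': 3}    # toy word -> id mapping
    vectors = {'cold': np.full(emb_dim, 0.5)}      # toy pretrained lookup

    # random init for unknown words, pretrained vectors where available
    weights = np.random.normal(scale=0.1, size=(len(word_index) + 1, emb_dim))
    for word, idx in word_index.items():
        if word in vectors:
            weights[idx] = vectors[word]

    embedding = Embedding(len(word_index) + 1, emb_dim,
                          weights=[weights], trainable=True)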

Citation

Please do not forget to cite the original paper if you use the implemented method:

@inproceedings{bowman2016generating,
  title={Generating sentences from a continuous space},
  author={Bowman, Samuel R and Vilnis, Luke and Vinyals, Oriol and Dai, Andrew M and Jozefowicz, Rafal and Bengio, Samy},
  booktitle={20th SIGNLL Conference on Computational Natural Language Learning, CoNLL 2016},
  pages={10--21},
  year={2016},
  organization={Association for Computational Linguistics (ACL)}
}

Citing this repo is not necessary, but it is greatly appreciated if you use this work.

@misc{Alekseev2018lstmvaekeras,
  author = {Alekseev~A.M.},
  title = {Generating Sentences from a Continuous Space, Keras implementation.},
  year = {2018},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/alexeyev/Keras-Generating-Sentences-from-a-Continuous-Space}},
  commit = {the latest commit of the codebase you have used}
}

Examples

Travelling the latent space (each block interpolates between the latent codes of two sentences; the left column is the interpolation coefficient):

    1000 samples, 40 epochs, toy example: train data
    ==  	 i 'm lucky . 	 	 ==
    1.00	 i 'm lucky 
    0.83	 i 'm lucky 
    0.67	 i 'm tough 
    0.50	 i 'm well 
    0.33	 i won . 
    0.17	 go calm 
    0.00	 slow down 
    ==  	 slow down . 	 	 	 ==
    
    3000 samples, 40 epochs, toy example: train data
    ==  	 it was long . 	 	 	 ==
    1.00	 it was long 
    0.83	 it was long 
    0.67	 it was new 
    0.50	 it was new 
    0.33	 it was wrong 
    0.17	 is that 
    0.00	 is that 
    ==  	 is that so ? 	 	 	 ==
    
    ==  	 i was ready . 	 	 	 ==
    1.00	 i was ready 
    0.83	 i was ready 
    0.67	 do n't die 
    0.50	 do n't die 
    0.33	 do n't lie 
    0.17	 he is here 
    0.00	 he is here 
    ==  	 he is here ! 	 	 	 ==
    
    ==  	 i feel cold . 	 	 	 ==
    1.00	 i feel cold 
    0.83	 i feel cold 
    0.67	 i feel . 
    0.50	 feel this 
    0.33	 bring wine 
    0.17	 say goodbye 
    0.00	 say goodbye 
    ==  	 say goodbye . 	 	 	 	 ==
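
Each row above is the decoder's output for a convex combination of the two boundary sentences' latent codes. A sketch of how such a traversal is produced, with encode and decode as hypothetical stand-ins for the encoder and greedy decoder the script builds:

    # sketch: linear interpolation between two latent codes
    import numpy as np

    z1 = encode("i 'm lucky .")   # hypothetical: sentence -> latent code
    z2 = encode("slow down .")

    for t in np.linspace(1.0, 0.0, 7):
        z = t * z1 + (1.0 - t) * z2          # travel the latent space
        print('%.2f\t%s' % (t, decode(z)))   # hypothetical: code -> sentence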