LSTM-Autoencoder

This project implements an LSTM Autoencoder for sequence modeling. The model encodes a sequence into a fixed-length vector and then decodes that vector to reconstruct the original sequence. The model can easily be extended to other encoder-decoder tasks.
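For readers new to Torch7, the sketch below shows how a single LSTM time step is commonly wired up with nngraph (the gModule style this project depends on). The function name lstm_cell and all sizes are illustrative, not identifiers from LSTMAutoencoder.lua; conceptually, the encoder applies such a cell over the input tokens and keeps the final cell/hidden state, and the decoder starts from that state to reconstruct the sequence token by token.

```lua
require 'torch'
require 'nn'
require 'nngraph'

-- One LSTM time step as an nngraph module:
-- inputs {x_t, c_{t-1}, h_{t-1}}  ->  outputs {c_t, h_t}
local function lstm_cell(input_size, rnn_size)
  local x      = nn.Identity()()
  local prev_c = nn.Identity()()
  local prev_h = nn.Identity()()

  -- all four gate pre-activations from two Linear maps
  local i2h   = nn.Linear(input_size, 4 * rnn_size)(x)
  local h2h   = nn.Linear(rnn_size,   4 * rnn_size)(prev_h)
  local gates = nn.CAddTable()({i2h, h2h})

  -- slice the pre-activations into the four gates (batched input assumed)
  local reshaped = nn.Reshape(4, rnn_size)(gates)
  local n1, n2, n3, n4 = nn.SplitTable(2)(reshaped):split(4)

  local in_gate      = nn.Sigmoid()(n1)
  local forget_gate  = nn.Sigmoid()(n2)
  local out_gate     = nn.Sigmoid()(n3)
  local in_transform = nn.Tanh()(n4)

  -- cell state update and hidden state output
  local next_c = nn.CAddTable()({
    nn.CMulTable()({forget_gate, prev_c}),
    nn.CMulTable()({in_gate, in_transform}),
  })
  local next_h = nn.CMulTable()({out_gate, nn.Tanh()(next_c)})

  return nn.gModule({x, prev_c, prev_h}, {next_c, next_h})
end

-- Smoke test with random data: a batch of 32 "word embeddings" of size 100
-- and a hidden size of 256 (sizes are illustrative).
local cell = lstm_cell(100, 256)
local x  = torch.randn(32, 100)
local c0 = torch.zeros(32, 256)
local h0 = torch.zeros(32, 256)
local c1, h1 = unpack(cell:forward({x, c0, h0}))
print(c1:size(), h1:size())  -- both 32 x 256
```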

Dependencies

This code requires Torch7 and nngraph.

Datasets

In general, with proper parameter settings, the model can recover 80%-90% of the words when tested on a small subset of the Toronto Book Corpus (http://www.cs.toronto.edu/~mbweb/).
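The recovery figure can be read as a per-token accuracy: a word counts as recovered when the decoder's highest-scoring vocabulary entry matches the original token. A minimal sketch of such a measurement is below; the names predictions and targets are illustrative, not variables from the project code.

```lua
-- Fraction of words recovered: compare the decoder's argmax prediction at
-- each time step with the original token id.
-- predictions: seq_len x vocab_size tensor of decoder scores (illustrative)
-- targets:     seq_len LongTensor of original token ids      (illustrative)
local function word_recovery_rate(predictions, targets)
  local _, predicted = predictions:max(2)          -- argmax over the vocabulary
  predicted = predicted:squeeze(2):typeAs(targets)
  local correct = predicted:eq(targets):sum()      -- exact token matches
  return correct / targets:nElement()
end
```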

Usage

To train a model with the default settings, simply run th LSTMAutoencoder.lua. The code generates samples at validation time so that the quality of the reconstruction can be inspected. One may use the autoencoder to obtain general-purpose sentence vectors, or as a pretraining step for downstream tasks.
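As a sketch of the sentence-vector use mentioned above, one common choice is to take the encoder's final hidden state as the representation of a sentence. The identifiers below (sentence_vector, encoder_cell, embed) are hypothetical and assume an LSTM step module like the one sketched earlier plus a trained nn.LookupTable; they are not names from LSTMAutoencoder.lua.

```lua
-- Hypothetical sketch: run a trained encoder over one sentence and return
-- the final hidden state as a general-purpose sentence vector.
-- encoder_cell: an LSTM step module {x, c, h} -> {c, h}   (assumed)
-- embed:        a trained nn.LookupTable word embedding    (assumed)
-- token_ids:    1D LongTensor of word ids for one sentence
local function sentence_vector(encoder_cell, embed, token_ids, rnn_size)
  local c = torch.zeros(1, rnn_size)
  local h = torch.zeros(1, rnn_size)
  for t = 1, token_ids:size(1) do
    local x = embed:forward(torch.LongTensor{token_ids[t]})  -- 1 x embed_size
    local out = encoder_cell:forward({x, c, h})
    c, h = out[1]:clone(), out[2]:clone()  -- clone: forward() reuses buffers
  end
  return h  -- final hidden state summarises the sentence
end
```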
