cheng6076 / Snli Attention
SNLI with word-word attention by LSTM encoder-decoder
SNLI task with an LSTM memory network encoder-decoder and neural attention
This is an implementation of the deep attention fusion LSTM memory network presented in the paper "Long Short-Term Memory Networks for Machine Reading".
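To illustrate the word-word attention idea, here is a minimal NumPy sketch: each hypothesis position attends over all premise positions and collects a weighted summary of the premise encodings. This is an illustrative simplification, not the repo's Lua/Torch code — in particular, the dot-product score stands in for the paper's learned scoring function, and the function name `word_word_attention` is assumed.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def word_word_attention(premise_h, hyp_h):
    """Word-word attention sketch (assumed helper, not the repo's API).

    premise_h: (m, d) premise hidden states
    hyp_h:     (n, d) hypothesis hidden states
    Returns (n, m) attention weights and (n, d) context vectors.
    """
    scores = hyp_h @ premise_h.T                    # (n, m) alignment scores
    alpha = np.apply_along_axis(softmax, 1, scores)  # normalize per hypothesis word
    context = alpha @ premise_h                      # (n, d) attended premise summaries
    return alpha, context

rng = np.random.default_rng(0)
alpha, ctx = word_word_attention(rng.normal(size=(5, 8)), rng.normal(size=(3, 8)))
```

Each row of `alpha` is a distribution over premise words, so rows sum to one and `ctx` has one attended premise summary per hypothesis word.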
Setup and Usage
This code requires Torch7 and nngraph, and has been updated to work with the Torch release from around May 2016. Minimal preprocessing is needed to obtain good accuracy: lower-casing and tokenization.
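The preprocessing the README asks for can be sketched as follows. This is an assumed helper, not the repo's own script: it only does the two steps mentioned above, lower-casing and tokenization (splitting off punctuation).

```python
import re

def preprocess(sentence):
    """Lower-case and tokenize, separating punctuation from words.

    A minimal sketch of the preprocessing the README describes;
    the repo may use a different tokenizer.
    """
    return re.findall(r"\w+|[^\w\s]", sentence.lower())

print(preprocess("A man, inspects the uniform!"))
# ['a', 'man', ',', 'inspects', 'the', 'uniform', '!']
```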
Citation
@article{cheng2016,
  author  = {Cheng, Jianpeng and Dong, Li and Lapata, Mirella},
  title   = {Long Short-Term Memory Networks for Machine Reading},
  journal = {EMNLP},
  year    = {2016},
  pages   = {551--562}
}