
shyamupa / Snli Entailment

Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras

Programming Languages

python
139,335 projects - #7 most used programming language

Projects that are alternatives to or similar to Snli Entailment

Attentiongan
AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation
Stars: ✭ 341 (+88.4%)
Mutual labels:  attention-model
Awesome Attention Mechanism In Cv
A PyTorch implementation collection of attention modules and other plug-and-play modules used in computer vision
Stars: ✭ 54 (-70.17%)
Mutual labels:  attention-model
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-34.25%)
Mutual labels:  attention-model
Attention Ocr Chinese Version
Attention OCR Based On Tensorflow
Stars: ✭ 421 (+132.6%)
Mutual labels:  attention-model
Reading comprehension tf
Machine Reading Comprehension in Tensorflow
Stars: ✭ 37 (-79.56%)
Mutual labels:  attention-model
Pytorch Attention Guided Cyclegan
Pytorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Stars: ✭ 67 (-62.98%)
Mutual labels:  attention-model
SAE-NAD
The implementation of "Point-of-Interest Recommendation: Exploiting Self-Attentive Autoencoders with Neighbor-Aware Influence"
Stars: ✭ 48 (-73.48%)
Mutual labels:  attention-model
Sa Tensorflow
Soft attention mechanism for video caption generation
Stars: ✭ 154 (-14.92%)
Mutual labels:  attention-model
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+446.96%)
Mutual labels:  attention-model
Transformer image caption
Image Captioning based on Bottom-Up and Top-Down Attention model
Stars: ✭ 94 (-48.07%)
Mutual labels:  attention-model
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+153.59%)
Mutual labels:  attention-model
Text Classification Pytorch
Text classification using deep learning models in Pytorch
Stars: ✭ 683 (+277.35%)
Mutual labels:  attention-model
Code
ECG Classification
Stars: ✭ 78 (-56.91%)
Mutual labels:  attention-model
Mtan
The implementation of "End-to-End Multi-Task Learning with Attention" [CVPR 2019].
Stars: ✭ 364 (+101.1%)
Mutual labels:  attention-model
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-30.39%)
Mutual labels:  attention-model
Attention ocr.pytorch
This repository implements an encoder-decoder model with attention for OCR
Stars: ✭ 278 (+53.59%)
Mutual labels:  attention-model
Deepattention
Deep Visual Attention Prediction (TIP18)
Stars: ✭ 65 (-64.09%)
Mutual labels:  attention-model
Pytorch Acnn Model
code of Relation Classification via Multi-Level Attention CNNs
Stars: ✭ 170 (-6.08%)
Mutual labels:  attention-model
Bamnet
Code & data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases"
Stars: ✭ 140 (-22.65%)
Mutual labels:  attention-model
Attention Gated Networks
Use of Attention Gates in a Convolutional Neural Network / Medical Image Classification and Segmentation
Stars: ✭ 1,237 (+583.43%)
Mutual labels:  attention-model

Implementations of an attention model for entailment from this paper, in Keras and TensorFlow.
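
For orientation, the sketch below shows one common formulation of attention over premise states used in this family of entailment models. It is a NumPy illustration under my own assumptions (toy dimensions and random weights), not this repo's code:

import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy dimensions: hidden size k, premise length L.
k, L = 64, 7
Y = np.random.randn(k, L)    # premise encoder outputs, one column per word
h_N = np.random.randn(k)     # final state after reading the hypothesis
W_y = np.random.randn(k, k)
W_h = np.random.randn(k, k)
w = np.random.randn(k)

M = np.tanh(W_y @ Y + (W_h @ h_N)[:, None])  # compare h_N against every premise word
alpha = softmax(w @ M)                        # attention weights over premise words
r = Y @ alpha                                 # attention-weighted premise representation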

Compatible with Keras v1.0.6 and TensorFlow 0.11.0rc2.

I implemented the model to learn the APIs of Keras and TensorFlow, so I have not really tuned for performance. The model implemented in Keras is a little different, as Keras does not expose a method to set an LSTM's state.
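
For what it's worth, later Keras versions do expose this. The sketch below (written against the newer Keras functional API, not the v1.0.6 this repo targets) shows a hypothesis LSTM initialised from the premise LSTM's final state — the kind of state-setting the Keras implementation here has to work around:

from keras.layers import Input, LSTM
from keras.models import Model

units, dim = 64, 100
premise = Input(shape=(None, dim))     # premise word vectors
hypothesis = Input(shape=(None, dim))  # hypothesis word vectors

# Encode the premise and keep its final hidden and cell states.
_, h, c = LSTM(units, return_state=True)(premise)

# Initialise the hypothesis encoder from the premise encoder's final state.
encoding = LSTM(units)(hypothesis, initial_state=[h, c])

model = Model(inputs=[premise, hypothesis], outputs=encoding)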

To train:

  • Download the SNLI dataset.
  • Create train, dev, and test files with tab-separated text, hypothesis, and label (see the example file train10.txt). You can find snippets in reader.py for this if you are lazy; a minimal conversion sketch also appears after these instructions.
  • Train by running either
python amodel.py -train <TRAIN> -dev <DEV> -test <TEST>

for the Keras implementation, or

python tf_model.py -train <TRAIN> -dev <DEV> -test <TEST>

for the TensorFlow implementation. Look at the get_params() method in each script to see how to specify different parameters.
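
If you build the tab-separated files from the official SNLI JSONL release, a minimal conversion sketch looks like this. The field names sentence1, sentence2, and gold_label come from the SNLI distribution; the script itself is illustrative, not code from this repo (see reader.py for the real snippets):

import json
import sys

def snli_jsonl_to_tsv(in_path, out_path):
    with open(in_path) as fin, open(out_path, "w") as fout:
        for line in fin:
            ex = json.loads(line)
            if ex["gold_label"] == "-":   # skip pairs without annotator consensus
                continue
            fout.write("\t".join([ex["sentence1"], ex["sentence2"], ex["gold_label"]]) + "\n")

if __name__ == "__main__":
    snli_jsonl_to_tsv(sys.argv[1], sys.argv[2])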

A log is written to a *.log file by a callback that records accuracy.
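
A minimal Keras callback of this kind might look like the sketch below; the class name, log format, and file name are my assumptions, not the repo's actual callback:

from keras.callbacks import Callback

class AccuracyLogger(Callback):
    """Append training and validation accuracy to a log file each epoch."""
    def __init__(self, log_path):
        super(AccuracyLogger, self).__init__()
        self.log_path = log_path

    def on_epoch_end(self, epoch, logs=None):
        logs = logs or {}
        with open(self.log_path, "a") as f:
            f.write("epoch %d: acc=%.4f val_acc=%.4f\n"
                    % (epoch, logs.get("acc", 0.0), logs.get("val_acc", 0.0)))

# Usage: model.fit(X, y, callbacks=[AccuracyLogger("run.log")])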

For comments, improvements, bug reports, and suggestions for tuning, email [email protected].
