
yanghanxy / CIAN

License: MIT
Implementation of the Character-level Intra Attention Network (CIAN) for Natural Language Inference (NLI) on the SNLI and MultiNLI corpora


Projects that are alternatives to or similar to CIAN

consistency
Implementation of models in our EMNLP 2019 paper: A Logic-Driven Framework for Consistency of Neural Models
Stars: ✭ 26 (+52.94%)
Mutual labels:  snli, mnli
organic-chemistry-reaction-prediction-using-NMT
organic chemistry reaction prediction using NMT with Attention
Stars: ✭ 30 (+76.47%)
Mutual labels:  attention-mechanism
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-5.88%)
Mutual labels:  attention-mechanism
bisemantic
Text pair classification
Stars: ✭ 12 (-29.41%)
Mutual labels:  snli
Video-Description-with-Spatial-Temporal-Attention
[ACM MM 2017 & IEEE TMM 2020] This is the Theano code for the paper "Video Description with Spatial Temporal Attention"
Stars: ✭ 53 (+211.76%)
Mutual labels:  attention-mechanism
NARRE
This is our implementation of NARRE: Neural Attentional Regression with Review-level Explanations
Stars: ✭ 100 (+488.24%)
Mutual labels:  attention-mechanism
Character-enhanced-Sememe-Prediction
Code accompanying Incorporating Chinese Characters of Words for Lexical Sememe Prediction (ACL2018) https://arxiv.org/abs/1806.06349
Stars: ✭ 22 (+29.41%)
Mutual labels:  character-level
uniformer-pytorch
Implementation of Uniformer, a simple attention and 3d convolutional net that achieved SOTA in a number of video classification tasks, debuted in ICLR 2022
Stars: ✭ 90 (+429.41%)
Mutual labels:  attention-mechanism
LSTM-Attention
A Comparison of LSTMs and Attention Mechanisms for Forecasting Financial Time Series
Stars: ✭ 53 (+211.76%)
Mutual labels:  attention-mechanism
memory-compressed-attention
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia By Summarizing Long Sequences"
Stars: ✭ 47 (+176.47%)
Mutual labels:  attention-mechanism
Fill-the-GAP
[ACL-WS] 4th place solution to gendered pronoun resolution challenge on Kaggle
Stars: ✭ 13 (-23.53%)
Mutual labels:  natural-language-inference
amta-net
Asymmetric Multi-Task Attention Network for Prostate Bed Segmentation in CT Images
Stars: ✭ 26 (+52.94%)
Mutual labels:  attention-mechanism
reasoning attention
Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (+100%)
Mutual labels:  snli
SA-DL
Sentiment Analysis with Deep Learning models. Implemented with Tensorflow and Keras.
Stars: ✭ 35 (+105.88%)
Mutual labels:  attention-mechanism
hexia
Mid-level PyTorch Based Framework for Visual Question Answering.
Stars: ✭ 24 (+41.18%)
Mutual labels:  attention-mechanism
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (+235.29%)
Mutual labels:  attention-mechanism
Neural-Chatbot
A Neural Network based Chatbot
Stars: ✭ 68 (+300%)
Mutual labels:  attention-mechanism
Visual-Attention-Model
Chainer implementation of Deepmind's Visual Attention Model paper
Stars: ✭ 27 (+58.82%)
Mutual labels:  attention-mechanism
dgcnn
Clean & Documented TF2 implementation of "An end-to-end deep learning architecture for graph classification" (M. Zhang et al., 2018).
Stars: ✭ 21 (+23.53%)
Mutual labels:  attention-mechanism
ChangeFormer
Official PyTorch implementation of our IGARSS'22 paper: A Transformer-Based Siamese Network for Change Detection
Stars: ✭ 220 (+1194.12%)
Mutual labels:  attention-mechanism

Character-level Intra Attention Network

Implementation of the paper Character-level Intra Attention Network (CIAN), presented at the RepEval Workshop at the 2017 Conference on Empirical Methods in Natural Language Processing (EMNLP 2017).

Architecture of the model:
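
As a rough orientation, the following minimal NumPy sketch illustrates the intra-attention idea from reference [2] below: every position in a sequence attends to every other position of the same sequence and is replaced by an attention-weighted average. This shows only the general mechanism, not CIAN's exact formulation.

    import numpy as np

    def intra_attention(H):
        # H: (timesteps, dim) matrix of token/character representations
        scores = H.dot(H.T)                            # pairwise similarities, (T, T)
        scores -= scores.max(axis=1, keepdims=True)    # stabilize the softmax
        weights = np.exp(scores)
        weights /= weights.sum(axis=1, keepdims=True)  # row-wise softmax
        return weights.dot(H)                          # attention-weighted contexts, (T, dim)

    H = np.random.randn(7, 16)   # e.g. 7 timesteps, 16-dim features
    A = intra_attention(H)       # (7, 16) context-enriched representations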

Requirements

The code is written in Python 2.7 and requires Keras 2.
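
The repository does not pin exact versions; as an assumption (not pins from the authors), a Python 2.7 environment along these lines should work:

    pip install "keras>=2.0,<2.3" "tensorflow<2" h5py numpy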

Data

The datasets can be downloaded from MultiNLI and SNLI.
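
Both corpora ship as JSON Lines files in which each example carries, among other fields, gold_label, sentence1, and sentence2. A minimal loader might look like this (the file name below is an example; adjust it to the split you downloaded):

    import json

    def load_nli(path):
        """Yield (premise, hypothesis, label) triples from an SNLI/MultiNLI .jsonl file."""
        with open(path) as f:
            for line in f:
                ex = json.loads(line)
                if ex["gold_label"] == "-":   # skip examples without annotator consensus
                    continue
                yield ex["sentence1"], ex["sentence2"], ex["gold_label"]

    pairs = list(load_nli("./data/snli_1.0_train.jsonl"))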

How to run

First, apply the modification to Keras described in the section below.

The datasets should be put in the ./data folder.

To run the model, use

python ./model.py

The results and log file will be saved in the ./log folder.

Modification to Keras

In #Python_Path\Lib\site-packages\keras\preprocessing\text.py, line 39,

CHANGE

    text = text.translate(maketrans(filters, split * len(filters)))

TO

    try:
        # Python 2: decode byte strings to unicode before translating
        text = unicode(text, "utf-8")
    except TypeError:
        # text was already unicode
        pass
    # unicode.translate expects a dict mapping code points to code points
    translate_table = {ord(c): ord(t) for c, t in zip(filters, split * len(filters))}
    text = text.translate(translate_table)
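
The original line breaks on unicode input under Python 2: maketrans builds a byte-string translation table, while unicode.translate expects a dict mapping code points to code points. The patch therefore decodes the text to unicode first and builds such a dict. A standalone demonstration of the dict-based form (values here are a small subset for illustration):

    # -*- coding: utf-8 -*-
    filters = '!"#$%\n\t'   # characters to strip (subset for illustration)
    split = ' '
    table = {ord(c): ord(t) for c, t in zip(filters, split * len(filters))}
    print(u'hello!\tworld"'.translate(table))   # -> u'hello  world '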

Result

Visualization of Attention

PairID 192997e, label Entailment

PairID 254941e, label Entailment

Reference

[1] Character-Aware Neural Language Models

[2] Intra Attention Mechanism
