
gentaiscool / lstm-attention

Licence: other
Attention-based bidirectional LSTM for Classification Task (ICASSP)

Programming Languages

python

Projects that are alternatives to or similar to lstm-attention

ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (-9.2%)
Mutual labels:  attention, attention-mechanism, emotion-recognition
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-26.44%)
Mutual labels:  attention, attention-mechanism
Hnatt
Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (+120.69%)
Mutual labels:  attention, attention-mechanism
Absa keras
Keras Implementation of Aspect based Sentiment Analysis
Stars: ✭ 126 (+44.83%)
Mutual labels:  attention, attention-mechanism
Guided Attention Inference Network
Contains an implementation of the Guided Attention Inference Network (GAIN) presented in Tell Me Where to Look (CVPR 2018). This repository aims to apply GAIN to the fcn8 architecture used for segmentation.
Stars: ✭ 204 (+134.48%)
Mutual labels:  attention, attention-mechanism
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+943.68%)
Mutual labels:  attention, attention-mechanism
Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+1620.69%)
Mutual labels:  attention, attention-mechanism
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+368.97%)
Mutual labels:  attention, attention-mechanism
Prediction Flow
Deep-Learning based CTR models implemented by PyTorch
Stars: ✭ 138 (+58.62%)
Mutual labels:  attention, attention-mechanism
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+140.23%)
Mutual labels:  attention, attention-mechanism
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (+97.7%)
Mutual labels:  attention, attention-mechanism
Performer Pytorch
An implementation of Performer, a linear attention-based transformer, in Pytorch
Stars: ✭ 546 (+527.59%)
Mutual labels:  attention, attention-mechanism
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+427.59%)
Mutual labels:  attention, attention-mechanism
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-75.86%)
Mutual labels:  attention, attention-mechanism
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411 (+372.41%)
Mutual labels:  attention, attention-mechanism
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-5.75%)
Mutual labels:  attention, attention-mechanism
Neat Vision
Neat (Neural Attention) Vision, is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
Stars: ✭ 213 (+144.83%)
Mutual labels:  attention, attention-mechanism
Attention
Implementations of several different attention mechanisms
Stars: ✭ 17 (-80.46%)
Mutual labels:  attention, attention-mechanism
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+251.72%)
Mutual labels:  attention, attention-mechanism
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (+44.83%)
Mutual labels:  attention, attention-mechanism

LSTM with Attention Using a Context Vector for Classification Tasks

An implementation of the paper Attention-Based LSTM for Psychological Stress Detection from Spoken Language Using Distant Supervision. The idea is to score the importance of every word in the input and use those scores for classification: an importance score is computed at each time step, the scores are normalized through a softmax layer, and the weighted sum of the hidden states across all time steps is used as the representation for classification.
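As a concrete illustration, here is a minimal sketch of such a context-vector attention layer, assuming Keras/TensorFlow; the class and variable names are illustrative, not the repository's exact code:

```python
import tensorflow as tf
from tensorflow.keras import layers

class ContextAttention(layers.Layer):
    """Sketch of context-vector attention: scores each time step against a
    trainable context vector, softmax-normalizes the scores over time, and
    returns the attention-weighted sum of the hidden states."""

    def build(self, input_shape):
        hidden_dim = int(input_shape[-1])
        # Trainable context vector used to score each time step.
        self.context = self.add_weight(
            name="context", shape=(hidden_dim, 1),
            initializer="glorot_uniform", trainable=True)

    def call(self, hidden_states):
        # hidden_states: (batch, time, hidden_dim)
        scores = tf.einsum("bth,ho->bto", tf.tanh(hidden_states), self.context)
        weights = tf.nn.softmax(scores, axis=1)  # normalize over time steps
        # Weighted sum over time -> one fixed-size vector per example.
        return tf.reduce_sum(weights * hidden_states, axis=1)
```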

If you use the code or data in your work, please cite the following paper (ICASSP 2018 Proceedings):

@INPROCEEDINGS{8461990,
  author={G. I. Winata and O. P. Kampman and P. Fung},
  booktitle={2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
  title={Attention-Based LSTM for Psychological Stress Detection from Spoken Language Using Distant Supervision},
  year={2018},
  pages={6204-6208},
  doi={10.1109/ICASSP.2018.8461990},
  ISSN={2379-190X},
  month={April},
}

Data

The stress datasets can be found in the data/ directory. The interview dataset is stored in CSV format and the tweet dataset in NumPy (.npy) format.
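A minimal loading sketch, assuming pandas and NumPy; the file names below are illustrative, so check data/ for the actual ones:

```python
import numpy as np
import pandas as pd

# Hypothetical file names; see the data/ directory for the real ones.
interview_df = pd.read_csv("data/interview.csv")        # interview dataset (CSV)
tweets = np.load("data/tweets.npy", allow_pickle=True)  # tweet dataset (.npy)
```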

Architecture

The architecture of the model, a bidirectional LSTM whose hidden states are pooled by the attention layer described above, is illustrated by the diagram in the repository.
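For reference, here is a sketch of such a model in Keras, reusing the ContextAttention layer from above; the hyperparameters are illustrative assumptions, not the paper's exact values:

```python
from tensorflow.keras import Input, Model
from tensorflow.keras.layers import LSTM, Bidirectional, Dense, Embedding

# Illustrative hyperparameters; the paper's exact values may differ.
MAX_LEN, VOCAB_SIZE, EMBED_DIM, HIDDEN = 100, 20000, 300, 128

inputs = Input(shape=(MAX_LEN,))
x = Embedding(VOCAB_SIZE, EMBED_DIM)(inputs)
# return_sequences=True keeps every hidden state for the attention layer.
x = Bidirectional(LSTM(HIDDEN, return_sequences=True))(x)
x = ContextAttention()(x)                    # weighted sum over time steps
outputs = Dense(1, activation="sigmoid")(x)  # binary stress label
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```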

You can easily extract the attention weights from the model and visualize them.
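One way to do this with the sketch model above is to expose the Bi-LSTM hidden states through a sub-model and recompute the softmax weights with the trained context vector; the layer indices and sample_batch below are assumptions tied to that sketch, not the repository's API:

```python
import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
from tensorflow.keras import Model

# Dummy batch of integer-encoded inputs, shape (batch, MAX_LEN).
sample_batch = np.random.randint(0, VOCAB_SIZE, size=(8, MAX_LEN))

# Layer indices match the sketch above: [2] = Bidirectional, [3] = attention.
attn_layer = model.layers[3]
hidden_model = Model(model.input, model.layers[2].output)

states = hidden_model.predict(sample_batch)  # (batch, time, hidden_dim)
scores = tf.einsum("bth,ho->bto", tf.tanh(states), attn_layer.context)
weights = tf.nn.softmax(scores, axis=1).numpy().squeeze(-1)  # (batch, time)

plt.imshow(weights, aspect="auto", cmap="viridis")  # one row per example
plt.xlabel("time step")
plt.ylabel("example")
plt.colorbar()
plt.show()
```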
