
lzfelix / keras_attention

License: MIT License
🔖 An Attention Layer in Keras

Programming Languages

  • Jupyter Notebook: 11667 projects
  • Python: 139335 projects (#7 most used programming language)

Projects that are alternatives to or similar to keras_attention

AoA-pytorch
A Pytorch implementation of Attention on Attention module (both self and guided variants), for Visual Question Answering
Stars: ✭ 33 (-23.26%)
Mutual labels:  attention-mechanism
attention-mechanism-keras
attention mechanism in keras, like Dense and RNN...
Stars: ✭ 19 (-55.81%)
Mutual labels:  attention-mechanism
SelfAttentive
Implementation of A Structured Self-attentive Sentence Embedding
Stars: ✭ 107 (+148.84%)
Mutual labels:  attention-mechanism
Patient2Vec
Patient2Vec: A Personalized Interpretable Deep Representation of the Longitudinal Electronic Health Record
Stars: ✭ 85 (+97.67%)
Mutual labels:  attention-mechanism
FragmentVC
Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention
Stars: ✭ 134 (+211.63%)
Mutual labels:  attention-mechanism
PAM
[TPAMI 2020] Parallax Attention for Unsupervised Stereo Correspondence Learning
Stars: ✭ 62 (+44.19%)
Mutual labels:  attention-mechanism
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (+0%)
Mutual labels:  attention-mechanism
CompareModels TRECQA
Compare six baseline deep learning models on TrecQA
Stars: ✭ 61 (+41.86%)
Mutual labels:  attention-mechanism
SentimentAnalysis
Sentiment Analysis: Deep Bi-LSTM+attention model
Stars: ✭ 32 (-25.58%)
Mutual labels:  attention-mechanism
Image-Caption
Using LSTM or Transformer to solve Image Captioning in Pytorch
Stars: ✭ 36 (-16.28%)
Mutual labels:  attention-mechanism
NTUA-slp-nlp
💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Stars: ✭ 19 (-55.81%)
Mutual labels:  attention-mechanism
enformer-pytorch
Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
Stars: ✭ 146 (+239.53%)
Mutual labels:  attention-mechanism
QuantumForest
Fast Differentiable Forest lib with the advantages of both decision trees and neural networks
Stars: ✭ 63 (+46.51%)
Mutual labels:  attention-mechanism
keras-deep-learning
Various implementations and projects on CNN, RNN, LSTM, GAN, etc
Stars: ✭ 22 (-48.84%)
Mutual labels:  attention-mechanism
Transformer-in-Transformer
An Implementation of Transformer in Transformer in TensorFlow for image classification, attention inside local patches
Stars: ✭ 40 (-6.98%)
Mutual labels:  attention-mechanism
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (+83.72%)
Mutual labels:  attention-mechanism
Video-Cap
🎬 Video Captioning: ICCV '15 paper implementation
Stars: ✭ 44 (+2.33%)
Mutual labels:  attention-mechanism
MoChA-pytorch
PyTorch Implementation of "Monotonic Chunkwise Attention" (ICLR 2018)
Stars: ✭ 65 (+51.16%)
Mutual labels:  attention-mechanism
ttslearn
ttslearn: Library for Pythonで学ぶ音声合成 (Text-to-speech with Python)
Stars: ✭ 158 (+267.44%)
Mutual labels:  attention-mechanism
nuwa-pytorch
Implementation of NÜWA, state of the art attention network for text to video synthesis, in Pytorch
Stars: ✭ 347 (+706.98%)
Mutual labels:  attention-mechanism

Keras Attention Layer

Dead-simple Attention layer implementation in Keras, based on the work of Yang et al., "Hierarchical Attention Networks for Document Classification".

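For orientation, the layer follows the word-level attention of Yang et al.: each timestep's hidden state is projected through a one-layer MLP, scored against a learned context vector, and the resulting softmax weights give a weighted sum of the hidden states. The sketch below is an illustrative re-implementation of that idea for Keras 2.2 with the TensorFlow backend, not the code shipped in this repository; the class name SimpleAttention and its details are assumptions.

```python
from keras import backend as K
from keras.layers import Layer


class SimpleAttention(Layer):
    """Illustrative Yang et al. style attention over RNN timesteps.

    Input:  (batch, timesteps, features)
    Output: (batch, features) -- the attention-weighted sum of the timesteps.
    """

    def __init__(self, **kwargs):
        self.supports_masking = True
        super(SimpleAttention, self).__init__(**kwargs)

    def build(self, input_shape):
        n_features = int(input_shape[-1])
        # W and b project each hidden state; u is the learned context vector.
        self.W = self.add_weight(name='att_W', shape=(n_features, n_features),
                                 initializer='glorot_uniform')
        self.b = self.add_weight(name='att_b', shape=(n_features,),
                                 initializer='zeros')
        self.u = self.add_weight(name='att_u', shape=(n_features,),
                                 initializer='glorot_uniform')
        super(SimpleAttention, self).build(input_shape)

    def call(self, x, mask=None):
        # u_t = tanh(W h_t + b)                      -> (batch, time, features)
        u_t = K.tanh(K.dot(x, self.W) + self.b)
        # score_t = u_t . u                          -> (batch, time)
        scores = K.sum(u_t * self.u, axis=-1)
        # Exp-normalize trick: subtract the per-sample max before exponentiating.
        scores = scores - K.max(scores, axis=-1, keepdims=True)
        weights = K.exp(scores)
        if mask is not None:
            # Zero out padded timesteps before normalizing.
            weights = weights * K.cast(mask, K.floatx())
        weights = weights / (K.sum(weights, axis=-1, keepdims=True) + K.epsilon())
        # Weighted sum over timesteps                -> (batch, features)
        return K.sum(x * K.expand_dims(weights, axis=-1), axis=1)

    def compute_mask(self, inputs, mask=None):
        # The timestep dimension is collapsed, so no mask is propagated.
        return None

    def compute_output_shape(self, input_shape):
        return (input_shape[0], input_shape[-1])
```

In a model, such a layer typically sits right after a recurrent layer that returns sequences, e.g. LSTM(64, return_sequences=True) followed by SimpleAttention().
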
Notice: the initial version of this repository was based on the implementation by Christos Baziotis. However, the repository was recently rewritten from scratch with the following features:

  • Compatibility with Keras 2.2 (tested with TensorFlow 1.8.0);
  • Annotations showing dimension transformations and equations;
  • Numerically stable softmax using the exp-normalize trick (see the sketch after this list); (New!)
  • Easy way to recover the attention weights applied to each sample to make nice visualizations (see neat-vision); (Updated!)
  • Example showing differences between vanilla, attention model and attention with masking;
  • Example on the sum toy task showing how attention weights can be distributed across timesteps in a sample; (New!)
  • Example on sentiment analysis of movie reviews (since GitHub does not render notebook markup, you may want to download the notebook to see the word highlights, as in the example below); (New!)
  • Allows customizing the attention activation function, since removing it might be beneficial for some tasks, as shown in "A Thorough Examination of the CNN/Daily Mail Reading Comprehension Task" by Chen et al. (New!)

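The exp-normalize trick mentioned in the list above keeps the softmax from overflowing: subtracting the per-row maximum before exponentiating leaves the result mathematically unchanged but bounds every exponent by zero. A minimal NumPy illustration (not code from this repository):

```python
import numpy as np


def stable_softmax(scores):
    """Softmax over the last axis using the exp-normalize trick.

    exp(s_i - max(s)) / sum_j exp(s_j - max(s)) equals the ordinary softmax,
    but every exponent is <= 0, so it cannot overflow.
    """
    shifted = scores - np.max(scores, axis=-1, keepdims=True)
    exp_scores = np.exp(shifted)
    return exp_scores / np.sum(exp_scores, axis=-1, keepdims=True)


# A naive softmax would overflow on these scores; the stable version does not.
print(stable_softmax(np.array([1000.0, 1001.0, 1002.0])))
```
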
Attention example on a movie review

Example of attention over the words of a movie review from the Keras IMDb dataset, for sentiment classification. Darker colors mean larger weights and, consequently, more importance given to those terms.
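
A figure like this can be reproduced from any list of tokens and their attention weights. The snippet below is a generic matplotlib sketch of that idea (the helper name highlight_words and the example weights are made up, not output from this repository):

```python
import matplotlib.pyplot as plt
import numpy as np


def highlight_words(words, weights):
    """Render words side by side, shading each one by its attention weight."""
    weights = np.asarray(weights, dtype=float)
    # Scale to [0, 0.85] so the highest-weighted words stay readable.
    weights = weights / weights.max() * 0.85
    fig, ax = plt.subplots(figsize=(len(words), 1))
    ax.axis('off')
    for i, (word, w) in enumerate(zip(words, weights)):
        ax.text(i, 0.5, word, ha='center', va='center',
                bbox=dict(facecolor=plt.cm.Reds(w), edgecolor='none', pad=4))
    ax.set_xlim(-0.5, len(words) - 0.5)
    plt.show()


highlight_words(['this', 'movie', 'was', 'absolutely', 'wonderful'],
                [0.05, 0.10, 0.05, 0.30, 0.50])
```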

Attention example

Example of attention weights across timesteps during the classification of a sequential sample.

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].