mjDelta / attention-mechanism-keras

Licence: other
Attention mechanism in Keras, for Dense and RNN networks

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to attention-mechanism-keras

Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+2536.84%)
Mutual labels:  attention-mechanism, attention-model
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (+126.32%)
Mutual labels:  attention-mechanism, attention-lstm
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+2315.79%)
Mutual labels:  attention-mechanism, attention-model
Deepattention
Deep Visual Attention Prediction (TIP18)
Stars: ✭ 65 (+242.11%)
Mutual labels:  attention-mechanism, attention-model
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (+563.16%)
Mutual labels:  attention-mechanism, attention-model
Attentionalpoolingaction
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (+1205.26%)
Mutual labels:  attention-mechanism, attention-model
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+5110.53%)
Mutual labels:  attention-mechanism, attention-model
Pytorch Attention Guided Cyclegan
Pytorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Stars: ✭ 67 (+252.63%)
Mutual labels:  attention-mechanism, attention-model
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by the means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (+526.32%)
Mutual labels:  attention-mechanism, attention-model
Keras Attention Mechanism
Attention mechanism Implementation for Keras.
Stars: ✭ 2,504 (+13078.95%)
Mutual labels:  attention-mechanism, attention-model
Compact-Global-Descriptor
Pytorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (+15.79%)
Mutual labels:  attention-mechanism, attention-model
keras-deep-learning
Various implementations and projects on CNN, RNN, LSTM, GAN, etc
Stars: ✭ 22 (+15.79%)
Mutual labels:  attention-mechanism
SANET
Arbitrary Style Transfer with Style-Attentional Networks
Stars: ✭ 105 (+452.63%)
Mutual labels:  attention-mechanism
halonet-pytorch
Implementation of the 😇 Attention layer from the paper, Scaling Local Self-Attention For Parameter Efficient Visual Backbones
Stars: ✭ 181 (+852.63%)
Mutual labels:  attention-mechanism
egfr-att
Drug effect prediction using neural network
Stars: ✭ 17 (-10.53%)
Mutual labels:  attention-mechanism
enformer-pytorch
Implementation of Enformer, Deepmind's attention network for predicting gene expression, in Pytorch
Stars: ✭ 146 (+668.42%)
Mutual labels:  attention-mechanism
AoA-pytorch
A Pytorch implementation of Attention on Attention module (both self and guided variants), for Visual Question Answering
Stars: ✭ 33 (+73.68%)
Mutual labels:  attention-mechanism
Attention
Repository for Attention Algorithm
Stars: ✭ 39 (+105.26%)
Mutual labels:  attention-model
Multigrid-Neural-Architectures
Multigrid Neural Architecture
Stars: ✭ 28 (+47.37%)
Mutual labels:  attention-mechanism
nystrom-attention
Implementation of Nyström Self-attention, from the paper Nyströmformer
Stars: ✭ 83 (+336.84%)
Mutual labels:  attention-mechanism

attention-mechanism-keras

This repo implements the attention mechanism in Keras.

To do list

  • Attention in Dense Network
  • Attention in RNN Network

Attention in Dense Network

The attention mechanism is used to determine which features should be paid more attention. A Dense layer is used to compute the attention probabilities. The probability plot is shown below.
(plot of per-feature attention probabilities)
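A minimal sketch of this idea (layer names and sizes here are illustrative assumptions, not this repo's exact code): a Dense layer with a softmax activation assigns a probability to each input feature, and the input is re-weighted by those probabilities before the final prediction.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features = 32  # hypothetical input width

inputs = layers.Input(shape=(n_features,))
# Dense + softmax yields one attention probability per feature (rows sum to 1)
attention_probs = layers.Dense(n_features, activation="softmax",
                               name="attention_probs")(inputs)
# re-weight the input features by their attention probabilities
attended = layers.Multiply()([inputs, attention_probs])
outputs = layers.Dense(1, activation="sigmoid")(attended)
model = keras.Model(inputs, outputs)

# the per-feature probabilities can be read out for plotting
prob_model = keras.Model(inputs, attention_probs)
x = np.random.rand(4, n_features).astype("float32")
probs = prob_model.predict(x, verbose=0)
```

With this layout, plotting a row of `probs` shows how much weight the model puts on each feature for that sample.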

Attention in RNN Network

The attention mechanism is used to determine which timesteps should be paid more attention (of course, you can switch to determining which features should be paid more attention instead). The probability plot is shown below.
(plot of per-timestep attention probabilities)
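A minimal sketch of timestep attention, under assumed sizes (the layer names and dimensions are illustrative, not this repo's exact code): each timestep of an LSTM's output sequence gets a score, a softmax over the time axis turns the scores into probabilities, and the weighted sum of the sequence becomes the context vector.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

timesteps, n_features = 20, 8  # hypothetical sequence shape

inputs = layers.Input(shape=(timesteps, n_features))
seq = layers.LSTM(16, return_sequences=True)(inputs)       # (batch, 20, 16)
scores = layers.Dense(1)(seq)                              # (batch, 20, 1)
# softmax over the time axis: one probability per timestep
probs = layers.Softmax(axis=1, name="time_probs")(scores)
# probability-weighted sum over timesteps -> context vector
context = layers.Multiply()([seq, probs])                  # broadcasts weights
context = layers.Lambda(lambda t: tf.reduce_sum(t, axis=1))(context)
outputs = layers.Dense(1, activation="sigmoid")(context)
model = keras.Model(inputs, outputs)

x = np.random.rand(2, timesteps, n_features).astype("float32")
y = model.predict(x, verbose=0)
```

Reading out the `time_probs` layer for a sample gives the per-timestep weights shown in the plot; switching the softmax to the feature axis would instead attend over features.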

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].