
lironui / Linear-Attention-Mechanism

License: AGPL-3.0
Attention mechanism

Programming Languages

Python

Projects that are alternatives to or similar to Linear-Attention-Mechanism

Medical Transformer
PyTorch code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (+466.67%)
Mutual labels:  attention, segmentation
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using word-specific models, all-word models and hierarchical models in TensorFlow
Stars: ✭ 33 (+22.22%)
Mutual labels:  attention, attention-mechanism
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (+537.04%)
Mutual labels:  attention, attention-mechanism
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (+366.67%)
Mutual labels:  attention, attention-mechanism
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+674.07%)
Mutual labels:  attention, attention-mechanism
Prediction Flow
Deep-learning-based CTR models implemented in PyTorch
Stars: ✭ 138 (+411.11%)
Mutual labels:  attention, attention-mechanism
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (+588.89%)
Mutual labels:  attention, attention-mechanism
Global Self Attention Network
A PyTorch implementation of Global Self-Attention Network, a fully-attentional backbone for vision tasks
Stars: ✭ 64 (+137.04%)
Mutual labels:  attention, attention-mechanism
Neat Vision
Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
Stars: ✭ 213 (+688.89%)
Mutual labels:  attention, attention-mechanism
Guided Attention Inference Network
Contains an implementation of the Guided Attention Inference Network (GAIN) presented in Tell Me Where to Look (CVPR 2018). This repository aims to apply GAIN to the FCN-8 architecture used for segmentation.
Stars: ✭ 204 (+655.56%)
Mutual labels:  attention, attention-mechanism
Absa keras
Keras implementation of aspect-based sentiment analysis
Stars: ✭ 126 (+366.67%)
Mutual labels:  attention, attention-mechanism
Im2LaTeX
An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-40.74%)
Mutual labels:  attention, attention-mechanism
Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+5444.44%)
Mutual labels:  attention, attention-mechanism
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (+111.11%)
Mutual labels:  attention, attention-mechanism
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (+203.7%)
Mutual labels:  attention, attention-mechanism
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (+581.48%)
Mutual labels:  attention, attention-mechanism
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+3262.96%)
Mutual labels:  attention, attention-mechanism
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-22.22%)
Mutual labels:  attention, attention-mechanism
Hnatt
Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (+611.11%)
Mutual labels:  attention, attention-mechanism
lstm-attention
Attention-based bidirectional LSTM for Classification Task (ICASSP)
Stars: ✭ 87 (+222.22%)
Mutual labels:  attention, attention-mechanism

Linear-Attention-Mechanism

Welcome to my HomePage

This repository implements the Linear Attention Mechanism in PyTorch.

The detailed formulation can be found in Linear Attention Mechanism: An Efficient Attention for Semantic Segmentation or Multi-stage Attention ResU-Net for Semantic Segmentation of Fine-Resolution Remote Sensing Images.
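In brief, the method replaces the softmax similarity exp(qᵀk) with its first-order Taylor approximation 1 + q̃ᵀk̃, where q̃ = q/‖q‖₂ and k̃ = k/‖k‖₂ are the L2-normalized query and key. The aggregates Σⱼ k̃ⱼvⱼᵀ and Σⱼ k̃ⱼ can then be computed once and reused for every query position, reducing the complexity from O(N²) to O(N) in the number of pixels. Below is a minimal PyTorch sketch of this computation; the module name, 1×1 projection layers, channel sizes and residual connection are illustrative assumptions, not code taken from this repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LinearAttentionSketch(nn.Module):
    """Illustrative sketch of linear attention (arXiv:2007.14902).

    exp(q.k) is approximated by 1 + (q/|q|).(k/|k|), so the key-value
    aggregation can be computed once and shared by all N positions.
    """

    def __init__(self, in_channels: int, key_channels: int = 8):
        super().__init__()
        # 1x1 convolutions as Q/K/V projections (an assumption;
        # the channel sizes here are illustrative).
        self.to_q = nn.Conv2d(in_channels, key_channels, 1)
        self.to_k = nn.Conv2d(in_channels, key_channels, 1)
        self.to_v = nn.Conv2d(in_channels, in_channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        n = h * w
        # Flatten spatial dims and L2-normalize q, k along channels,
        # so that 1 + q~.k~ is a valid (non-negative) similarity.
        q = F.normalize(self.to_q(x).view(b, -1, n), dim=1)  # (B, K, N)
        k = F.normalize(self.to_k(x).view(b, -1, n), dim=1)  # (B, K, N)
        v = self.to_v(x).view(b, c, n)                       # (B, C, N)

        # Global key-value aggregation, computed once: sum_j k~_j v_j^T
        kv = torch.einsum("bkn,bcn->bkc", k, v)              # (B, K, C)

        # Numerator:   sum_j v_j  +  q~_i^T (sum_j k~_j v_j^T)
        num = v.sum(-1, keepdim=True) + torch.einsum("bkn,bkc->bcn", q, kv)
        # Denominator: N  +  q~_i^T sum_j k~_j
        den = n + torch.einsum("bkn,bk->bn", q, k.sum(-1)).unsqueeze(1)

        out = (num / den).view(b, c, h, w)
        return out + x  # residual connection (also an assumption)

# Quick shape check:
# attn = LinearAttentionSketch(in_channels=64)
# attn(torch.randn(2, 64, 32, 32)).shape  # -> torch.Size([2, 64, 32, 32])
```

Because the quadratic N×N attention map is never materialized, memory and compute scale linearly with image size, which is what makes this attention practical for fine-resolution segmentation.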

If our code is helpful to you, please cite:

  1. R. Li, J. Su, C. Duan and S. Zheng. "Linear Attention Mechanism: An Efficient Attention for Semantic Segmentation." arXiv preprint arXiv:2007.14902, 2020.
  2. J. Su. (2020, Jul 04). "Exploring Linear Attention: Must Attention Have a Softmax?" [Blog post]. Retrieved from https://spaces.ac.cn/archives/7546
  3. R. Li, S. Zheng, C. Duan, J. Su and C. Zhang. "Multistage Attention ResU-Net for Semantic Segmentation of Fine-Resolution Remote Sensing Images." IEEE Geoscience and Remote Sensing Letters, doi: 10.1109/LGRS.2021.3063381.