
Top 223 attention-mechanism open source projects

Attentionalpoolingaction
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Aoanet
Code for the ICCV 2019 paper "Attention on Attention for Image Captioning"
Linformer Pytorch
My take on a practical implementation of Linformer for Pytorch.
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Triplet Attention
Official PyTorch Implementation for "Rotate to Attend: Convolutional Triplet Attention Module." [WACV 2021]
X Transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Dalle Pytorch
Implementation / replication of DALL-E, OpenAI's Text to Image Transformer, in Pytorch
Lightnetplusplus
LightNet++: Boosted Light-weighted Networks for Real-time Semantic Segmentation
Neat Vision
Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks.
Linear Attention Transformer
Transformer based on a variant of attention whose complexity is linear with respect to sequence length
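
For context, linear attention of this kind typically swaps the softmax(QK^T)V product for a kernel feature map phi(.) so it can be re-associated as phi(Q)(phi(K)^T V), making cost linear in sequence length. A minimal non-causal sketch in PyTorch, assuming the elu(x)+1 feature map popularized by Katharopoulos et al. (2020); this is an illustration, not this repository's code:

    import torch
    import torch.nn.functional as F

    def linear_attention(q, k, v, eps=1e-6):
        # q, k, v: (batch, heads, seq_len, dim); non-causal variant for brevity.
        # Feature map phi(x) = elu(x) + 1 keeps the scores positive.
        q = F.elu(q) + 1
        k = F.elu(k) + 1
        kv = torch.einsum('bhnd,bhne->bhde', k, v)                          # sum over the sequence first
        z = 1.0 / (torch.einsum('bhnd,bhd->bhn', q, k.sum(dim=2)) + eps)    # per-query normalizer
        return torch.einsum('bhnd,bhde,bhn->bhne', q, kv, z)                # (batch, heads, seq_len, dim)

    out = linear_attention(*(torch.randn(2, 8, 4096, 64) for _ in range(3)))

Because the (K^T V) summary is formed before multiplying by Q, memory and compute grow linearly with sequence length rather than quadratically.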
Guided Attention Inference Network
Implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository applies GAIN to the FCN-8 architecture used for segmentation.
Attention Mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Point Transformer Pytorch
Implementation of the Point Transformer layer, in Pytorch
Csa Inpainting
Coherent Semantic Attention for image inpainting (ICCV 2019)
Sca Cnn.cvpr17
Image Caption Generation with Spatial and Channel-wise Attention
Hnatt
Train and visualize Hierarchical Attention Networks
Attentive Gan Derainnet
Unofficial TensorFlow implementation of the "Attentive Generative Adversarial Network for Raindrop Removal from a Single Image" (CVPR 2018) model: https://maybeshewill-cv.github.io/attentive-gan-derainnet/
Graph attention pool
Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
A Pytorch Tutorial To Text Classification
Hierarchical Attention Networks | a PyTorch Tutorial to Text Classification
Slot filling intent joint model
Attention-based joint model for intent detection and slot filling
Lstm attention
Attention-based LSTM/Dense models implemented in Keras
Eeg Dl
A deep learning library for EEG signal classification tasks, based on TensorFlow.
Gat
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Picanet Implementation
Pytorch Implementation of PiCANet: Learning Pixel-wise Contextual Attention for Saliency Detection
Sinkhorn Transformer
Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention
Pan
[Only 272K parameters] Efficient Image Super-Resolution Using Pixel Attention, ECCV Workshops, 2020.
Hart
Hierarchical Attentive Recurrent Tracking
Awesome Speech Recognition Speech Synthesis Papers
Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC)
Seq2seq chatbot new
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam search; trained on the Cornell Movie Dialogs corpus
Attribute Aware Attention
[ACM MM 2018] Attribute-Aware Attention Model for Fine-grained Representation Learning
Document Classifier Lstm
A bidirectional LSTM with attention for multiclass/multilabel text classification.
Hierarchical Multi Label Text Classification
Code for the CIKM '19 paper "Hierarchical Multi-label Text Classification: An Attention-based Recurrent Network Approach"
Perceiver Pytorch
Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch
Abstractive Summarization
Implementation of abstractive summarization using LSTMs in an encoder-decoder architecture with local attention.
Absa keras
Keras implementation of aspect-based sentiment analysis
Transformer In Generating Dialogue
An implementation of 'Attention Is All You Need' trained on a Chinese corpus
Drln
Densely Residual Laplacian Super-Resolution, IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2020
Yolov3 Point
A from-scratch YOLOv3 tutorial with annotated code, plus attention modules (SE, SPP, RFB, etc.)
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop over the cell state, just like any other RNN cell. (LARNN)
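
As the description says, the cell attends over a sliding window of its own previous cell states. A rough sketch of just that querying step, using PyTorch's nn.MultiheadAttention; the window size and layer choice here are illustrative assumptions, not the repository's exact formulation:

    import torch
    import torch.nn as nn

    class WindowedStateAttention(nn.Module):
        """Attend from the current cell state over the last `window` cell states (sketch)."""
        def __init__(self, hidden_dim, num_heads=4, window=16):
            super().__init__()
            self.window = window
            self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)

        def forward(self, current_state, past_states):
            # current_state: (batch, hidden_dim); past_states: (batch, t, hidden_dim)
            memory = past_states[:, -self.window:]            # keep only the recent window
            query = current_state.unsqueeze(1)                # one query per time step
            context, _ = self.attn(query, memory, memory)     # multi-head attention over past states
            return context.squeeze(1)                         # (batch, hidden_dim)

In the full cell, a context vector like this would be combined with the usual LSTM gates at each step of the recurrence loop.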
Geoman
TensorFlow implementation of GeoMAN (IJCAI-18)
Pygat
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
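
For reference, the attention coefficients in the GAT paper linked above are alpha_ij = softmax_j(LeakyReLU(a^T [W h_i || W h_j])), computed only over each node's neighbours. A compact single-head, dense-adjacency sketch in PyTorch (illustrative only, not this repository's code):

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GraphAttentionLayer(nn.Module):
        """Single-head GAT layer over a dense adjacency matrix (adj should include self-loops)."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.W = nn.Linear(in_dim, out_dim, bias=False)
            self.a = nn.Linear(2 * out_dim, 1, bias=False)

        def forward(self, h, adj):
            # h: (num_nodes, in_dim); adj: (num_nodes, num_nodes), 1 where an edge exists
            wh = self.W(h)                                    # (N, out_dim)
            n = wh.size(0)
            pairs = torch.cat([wh.unsqueeze(1).expand(n, n, -1),
                               wh.unsqueeze(0).expand(n, n, -1)], dim=-1)
            e = F.leaky_relu(self.a(pairs).squeeze(-1), negative_slope=0.2)
            e = e.masked_fill(adj == 0, float('-inf'))        # attend only over neighbours
            alpha = torch.softmax(e, dim=-1)                  # per-node normalization
            return alpha @ wh                                 # (N, out_dim)

The paper stacks several such heads (concatenated in hidden layers, averaged at the output) with an ELU nonlinearity between layers.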
Text recognition toolbox
text_recognition_toolbox: a uniform PyTorch reimplementation of a series of classical scene text recognition papers.
Overlappredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stanet
Official implementation of the spatial-temporal attention neural network (STANet) for remote sensing image change detection
Ylg
[CVPR 2020] Official Implementation: "Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models".
Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Dhf1k
Revisiting Video Saliency: A Large-scale Benchmark and a New Model (CVPR18, PAMI19)