Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+294.34%)
Mutual labels: attention-mechanism
TianChi AIEarth
TianChi AIEarth Contest Solution
Stars: ✭ 57 (+7.55%)
Mutual labels: attention-mechanism
Neural-Chatbot
A neural-network-based chatbot
Stars: ✭ 68 (+28.3%)
Mutual labels: attention-mechanism
Aoanet
Code for the ICCV 2019 paper "Attention on Attention for Image Captioning"
Stars: ✭ 242 (+356.6%)
Mutual labels: attention-mechanism
Transformers-RL
An easy PyTorch implementation of "Stabilizing Transformers for Reinforcement Learning"
Stars: ✭ 107 (+101.89%)
Mutual labels: attention-mechanism
SA-DL
Sentiment analysis with deep learning models, implemented with TensorFlow and Keras.
Stars: ✭ 35 (-33.96%)
Mutual labels: attention-mechanism
X Transformers
A simple but complete full-attention transformer with a set of promising experimental features from various papers
Stars: ✭ 211 (+298.11%)
Mutual labels: attention-mechanism
NARRE
Our implementation of NARRE: Neural Attentional Regression with Review-level Explanations
Stars: ✭ 100 (+88.68%)
Mutual labels: attention-mechanism
question-generation
Neural Models for Key Phrase Detection and Question Generation
Stars: ✭ 29 (-45.28%)
Mutual labels: attention-mechanism
STAM-pytorch
Implementation of STAM (Space Time Attention Model), a pure and simple attention model that reaches SOTA for video classification
Stars: ✭ 109 (+105.66%)
Mutual labels: attention-mechanism
Attentionalpoolingaction
Code/model release for the NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (+367.92%)
Mutual labels: attention-mechanism
DARNN
A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction
Stars: ✭ 90 (+69.81%)
Mutual labels: attention-mechanism
Linformer Pytorch
My take on a practical implementation of Linformer for PyTorch.
Stars: ✭ 239 (+350.94%)
Mutual labels: attention-mechanism
memory-compressed-attention
Implementation of Memory-Compressed Attention, from the paper "Generating Wikipedia by Summarizing Long Sequences"
Stars: ✭ 47 (-11.32%)
Mutual labels: attention-mechanism
Triplet Attention
Official PyTorch implementation of "Rotate to Attend: Convolutional Triplet Attention Module" (WACV 2021)
Stars: ✭ 222 (+318.87%)
Mutual labels: attention-mechanism
Im2LaTeX
An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-69.81%)
Mutual labels: attention-mechanism
Optic-Disc-Unet
Attention U-Net model with post-processing for retinal optic disc segmentation
Stars: ✭ 77 (+45.28%)
Mutual labels: attention-mechanism
Visual-Attention-Model
Chainer implementation of DeepMind's Visual Attention Model paper
Stars: ✭ 27 (-49.06%)
Mutual labels: attention-mechanism
amta-net
Asymmetric Multi-Task Attention Network for Prostate Bed Segmentation in CT Images
Stars: ✭ 26 (-50.94%)
Mutual labels: attention-mechanism
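All of the repositories above share the attention-mechanism label. As a point of reference, the core operation most of them build on is scaled dot-product attention (Vaswani et al., 2017); the sketch below is a minimal NumPy illustration of that operation, not code from any of the listed projects, and the function and variable names are my own.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention.

    q: queries, shape (seq_q, d_k); k: keys, shape (seq_k, d_k);
    v: values, shape (seq_k, d_v). Returns attended values, (seq_q, d_v).
    """
    d_k = q.shape[-1]
    # Pairwise query-key similarity, scaled to keep logits well-conditioned.
    scores = q @ k.T / np.sqrt(d_k)          # shape (seq_q, seq_k)
    # Row-wise softmax (subtract the max for numerical stability).
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a convex combination of the value rows.
    return weights @ v

# Toy usage: self-attention over 3 tokens with 4-dimensional embeddings,
# where queries, keys, and values all come from the same input.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(x, x, x)
```

The individual projects layer different structure on top of this primitive (spatial attention for vision, dual-stage attention for time series, compressed or linear variants for long sequences), but the weighted-sum-of-values pattern is common to all of them.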