Sinet: Camouflaged Object Detection, CVPR 2020 (Oral & reported by New Scientist magazine)
Generative Inpainting Pytorch: A PyTorch reimplementation of the paper "Generative Image Inpainting with Contextual Attention" (https://arxiv.org/abs/1801.07892)
Generative Inpainting: DeepFill v1/v2 with Contextual Attention and Gated Convolution (CVPR 2018 and ICCV 2019 Oral)
Snli Entailment: Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras
Sa Tensorflow: Soft attention mechanism for video caption generation
Bamnet: Code & data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases"
Linear Attention Recurrent Neural Network (LARNN): A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop over the cell state, just like any other RNN cell.
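The mechanism described above can be sketched in a few lines of PyTorch. This is a minimal illustration of the idea (an LSTM cell whose new cell state is refined by multi-head attention over a window of its own past cell states), not the official LARNN code; the class name, window size, and residual merge are assumptions for the sketch.

```python
import torch
import torch.nn as nn

class WindowedAttentionLSTMCell(nn.Module):
    # Illustrative sketch only: an LSTM cell whose new cell state is
    # refined by multi-head attention over a window of past cell states.
    def __init__(self, input_size, hidden_size, num_heads=4, window=16):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.window = window

    def forward(self, x, state, past_cells):
        h, c = self.cell(x, state)
        # Attend from the current cell state over the most recent cell states.
        memory = torch.stack(past_cells[-self.window:] + [c], dim=1)  # (B, W, H)
        attended, _ = self.attn(c.unsqueeze(1), memory, memory)
        c = c + attended.squeeze(1)  # residual merge of the attended memory
        past_cells.append(c)
        return h, c

# Usage: loop over time steps, just like a plain LSTMCell.
cell = WindowedAttentionLSTMCell(input_size=8, hidden_size=32)
x = torch.randn(4, 10, 8)          # (batch, time, features)
h = c = torch.zeros(4, 32)
past = []
for t in range(x.size(1)):
    h, c = cell(x[:, t], (h, c), past)
```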
Attention Gated Networks: Use of attention gates in a convolutional neural network for medical image classification and segmentation
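The core building block of such networks, an additive attention gate, can be sketched as follows. This is a hedged illustration in the spirit of Attention U-Net, not the repository's exact code: the channel sizes and the assumption that the skip features `x` and gating signal `g` share the same spatial resolution are simplifications.

```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    # Illustrative additive attention gate: a gating signal g modulates
    # skip-connection features x via a learned per-pixel mask.
    def __init__(self, in_ch, gate_ch, inter_ch):
        super().__init__()
        self.theta = nn.Conv2d(in_ch, inter_ch, kernel_size=1)   # skip features
        self.phi = nn.Conv2d(gate_ch, inter_ch, kernel_size=1)   # gating signal
        self.psi = nn.Conv2d(inter_ch, 1, kernel_size=1)         # scalar mask

    def forward(self, x, g):
        # Additive attention: sigmoid(psi(relu(theta(x) + phi(g)))) yields a
        # per-pixel mask that suppresses irrelevant spatial regions in x.
        mask = torch.sigmoid(self.psi(torch.relu(self.theta(x) + self.phi(g))))
        return x * mask

gate = AttentionGate(in_ch=64, gate_ch=128, inter_ch=32)
x = torch.randn(2, 64, 16, 16)    # skip-connection features
g = torch.randn(2, 128, 16, 16)   # coarser gating signal, assumed resized to match
out = gate(x, g)
```

In practice the gating signal comes from a coarser decoder stage and is resampled to the skip features' resolution before gating.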
Sockeye: Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Nmt Keras: Neural machine translation with Keras
Mtan: The implementation of "End-to-End Multi-Task Learning with Attention" (CVPR 2019)
Attentiongan: AttentionGAN for unpaired image-to-image translation and multi-domain image-to-image translation
Attention Ocr.pytorch: An encoder-decoder model with attention for OCR
SAE-NAD: The implementation of "Point-of-Interest Recommendation: Exploiting Self-Attentive Autoencoders with Neighbor-Aware Influence"
Caver: A toolkit for multi-label text classification
PBAN-PyTorch: PyTorch implementation of "A Position-aware Bidirectional Attention Network for Aspect-level Sentiment Analysis"
GATE: The implementation of "Gated Attentive-Autoencoder for Content-Aware Recommendation"
SANET: "Arbitrary Style Transfer with Style-Attentional Networks" (CVPR 2019)