Image Caption Generator: A neural network that generates captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (-10%)
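The entry above decodes captions with beam search. As an illustration of that decoding strategy (a generic, repo-agnostic sketch, not the repository's actual code; `step_fn` is a hypothetical callback standing in for the trained decoder), the idea is to keep only the `beam_width` highest-scoring partial sequences at each step:

```python
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=10):
    """Generic beam-search decoder.

    step_fn(sequence) -> dict mapping each candidate next token to its
    probability. Keeps the beam_width partial sequences with the highest
    cumulative log-probability.
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-prob)
    completed = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:            # finished sequence: set aside
                completed.append((seq, score))
                continue
            for tok, p in step_fn(seq).items():  # expand each live beam
                candidates.append((seq + [tok], score + math.log(p)))
        if not candidates:
            break
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = candidates[:beam_width]          # prune to the beam width
    completed.extend(b for b in beams if b[0][-1] == end_token)
    best = max(completed or beams, key=lambda c: c[1])
    return best[0]
```

Unlike greedy decoding, this can recover a sequence whose first token is not the single most likely one, at the cost of `beam_width` times more model evaluations per step.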
Linear Attention Recurrent Neural Network (LARNN): A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN cell.
Stars: ✭ 119 (-15%)
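The core operation described above, attending over a window of past cell states, can be sketched as single-head scaled dot-product attention (a simplified illustration of the windowed-attention idea under the usual Transformer formulation, not the LARNN authors' code; the function name and shapes are assumptions):

```python
import numpy as np

def windowed_attention(query, past_states, window=4):
    """Single-head scaled dot-product attention of a query vector over
    the last `window` past states, returning their weighted sum."""
    keys = np.asarray(past_states[-window:])          # (w, d): the attended window
    scores = keys @ query / np.sqrt(query.shape[-1])  # scaled similarity scores
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                          # softmax over the window
    return weights @ keys                             # attention-weighted sum of states
```

Restricting attention to a fixed window keeps the per-step cost constant as the sequence grows, which is what lets the cell be applied in a loop like an ordinary RNN cell.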
Attention Gated Networks: Use of attention gates in a convolutional neural network for medical image classification and segmentation.
Stars: ✭ 1,237 (+783.57%)
Code: ECG classification.
Stars: ✭ 78 (-44.29%)
Deepattention: Deep Visual Attention Prediction (TIP18).
Stars: ✭ 65 (-53.57%)
Sockeye: Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet.
Stars: ✭ 990 (+607.14%)
Nmt Keras: Neural machine translation with Keras.
Stars: ✭ 501 (+257.86%)
Mtan: Implementation of "End-to-End Multi-Task Learning with Attention" (CVPR 2019).
Stars: ✭ 364 (+160%)
Attentiongan: AttentionGAN for unpaired image-to-image translation and multi-domain image-to-image translation.
Stars: ✭ 341 (+143.57%)
Attention ocr.pytorch: An encoder-decoder model with attention for OCR.
Stars: ✭ 278 (+98.57%)
SAE-NAD: Implementation of "Point-of-Interest Recommendation: Exploiting Self-Attentive Autoencoders with Neighbor-Aware Influence".
Stars: ✭ 48 (-65.71%)
Caver: A toolkit for multilabel text classification.
Stars: ✭ 38 (-72.86%)
PBAN-PyTorch: A PyTorch implementation of the Position-aware Bidirectional Attention Network for aspect-level sentiment analysis.
Stars: ✭ 33 (-76.43%)
Attention: Repository of attention algorithms.
Stars: ✭ 39 (-72.14%)
Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-84.29%)
GATE: Implementation of "Gated Attentive-Autoencoder for Content-Aware Recommendation".
Stars: ✭ 65 (-53.57%)
SANET: "Arbitrary Style Transfer with Style-Attentional Networks" (CVPR 2019).
Stars: ✭ 21 (-85%)
reasoning attention: Unofficial implementations of attention models on the SNLI dataset.
Stars: ✭ 34 (-75.71%)
G2P: Grapheme-to-phoneme conversion.
Stars: ✭ 59 (-57.86%)
Sinet: Camouflaged object detection, CVPR 2020 (oral; covered by New Scientist magazine).
Stars: ✭ 246 (+75.71%)
Attentionalpoolingaction: Code and model release for the NIPS 2017 paper "Attentional Pooling for Action Recognition".
Stars: ✭ 248 (+77.14%)
Generative Inpainting Pytorch: A PyTorch reimplementation of the paper "Generative Image Inpainting with Contextual Attention" (https://arxiv.org/abs/1801.07892).
Stars: ✭ 242 (+72.86%)
Generative inpainting: DeepFill v1/v2 with contextual attention and gated convolution (CVPR 2018 and ICCV 2019 oral).
Stars: ✭ 2,659 (+1799.29%)
Snli Entailment: Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras.
Stars: ✭ 181 (+29.29%)
Pytorch Acnn Model: Code for "Relation Classification via Multi-Level Attention CNNs".
Stars: ✭ 170 (+21.43%)
Sa Tensorflow: Soft attention mechanism for video caption generation.
Stars: ✭ 154 (+10%)