Pytorch Acnn Model: Code for "Relation Classification via Multi-Level Attention CNNs"
Stars: ✭ 170 (-6.08%)
Sa Tensorflow: Soft attention mechanism for video caption generation
Stars: ✭ 154 (-14.92%)
Bamnet: Code & data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases"
Stars: ✭ 140 (-22.65%)
Image Caption Generator: A neural network that generates captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (-30.39%)
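Beam search, named in the entry above, keeps only the top-k highest-scoring partial sequences at each decoding step instead of committing greedily to one. A minimal, generic sketch (illustrative, not this repository's code; `step_logprobs` is a hypothetical scoring function returning log-probabilities over the next token given a partial sequence):

```python
import math

def beam_search(step_logprobs, vocab, beam_width=3, max_len=5, eos=None):
    """Generic beam search over token sequences (illustrative sketch)."""
    beams = [([], 0.0)]  # each beam: (token sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if eos is not None and seq and seq[-1] == eos:
                candidates.append((seq, score))  # finished beams carry over unchanged
                continue
            # expand this beam with every possible next token
            for tok, lp in zip(vocab, step_logprobs(seq)):
                candidates.append((seq + [tok], score + lp))
        # prune: keep only the top-k highest-scoring partial sequences
        beams = sorted(candidates, key=lambda b: b[1], reverse=True)[:beam_width]
    return beams[0][0]

# toy usage: a scorer that always prefers token "a" yields "a a a"
scorer = lambda seq: [math.log(0.7), math.log(0.3)]
best = beam_search(scorer, ["a", "b"], beam_width=2, max_len=3)  # → ["a", "a", "a"]
```

With `beam_width=1` this degenerates to greedy decoding; wider beams trade compute for a better chance of finding a higher-probability caption.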
Linear Attention Recurrent Neural Network: A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop over the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-34.25%)
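The LARNN idea above can be sketched in a few lines: an ordinary LSTM cell whose new cell state additionally attends over a window of its own past cell states. This is a minimal illustration using `torch.nn.MultiheadAttention`, not the repository's actual implementation; the class name and the way the attended context is blended back in are assumptions.

```python
import torch
import torch.nn as nn

class WindowedAttentionLSTMCell(nn.Module):
    """Sketch of an LSTM cell attending over a window of its past cell states."""
    def __init__(self, input_size, hidden_size, window=8, num_heads=4):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.window = window

    def forward(self, x, state, past_cells):
        # past_cells: list of previous cell states, each of shape (batch, hidden)
        h, c = self.cell(x, state)
        if past_cells:
            mem = torch.stack(past_cells[-self.window:], dim=1)  # (batch, W, hidden)
            # query = current cell state; keys/values = windowed past cell states
            ctx, _ = self.attn(c.unsqueeze(1), mem, mem)
            c = c + ctx.squeeze(1)  # blend attended context into the cell state (assumed)
        return h, c

# usage: loop over time steps, appending each new cell state to the memory
cell = WindowedAttentionLSTMCell(16, 32)
h, c, past = torch.zeros(2, 32), torch.zeros(2, 32), []
for t in range(5):
    h, c = cell(torch.randn(2, 16), (h, c), past)
    past.append(c)
```

Because the attention memory is a bounded window rather than the full history, the per-step cost stays constant, which is what lets the cell drop into an ordinary RNN loop.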
Attention Gated Networks: Use of attention gates in a convolutional neural network for medical image classification and segmentation
Stars: ✭ 1,237 (+583.43%)
Code: ECG classification
Stars: ✭ 78 (-56.91%)
Deepattention: Deep Visual Attention Prediction (TIP 2018)
Stars: ✭ 65 (-64.09%)
Sockeye: Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Stars: ✭ 990 (+446.96%)
Nmt Keras: Neural machine translation with Keras
Stars: ✭ 501 (+176.8%)
Mtan: Implementation of "End-to-End Multi-Task Learning with Attention" (CVPR 2019)
Stars: ✭ 364 (+101.1%)
Attentiongan: AttentionGAN for unpaired image-to-image translation and multi-domain image-to-image translation
Stars: ✭ 341 (+88.4%)
Attention ocr.pytorch: Implements an encoder-decoder model with attention for OCR
Stars: ✭ 278 (+53.59%)
SAE-NAD: Implementation of "Point-of-Interest Recommendation: Exploiting Self-Attentive Autoencoders with Neighbor-Aware Influence"
Stars: ✭ 48 (-73.48%)
Caver: A toolkit for multilabel text classification.
Stars: ✭ 38 (-79.01%)
PBAN-PyTorch: PyTorch implementation of "A Position-aware Bidirectional Attention Network for Aspect-level Sentiment Analysis"
Stars: ✭ 33 (-81.77%)
Attention: Repository for attention algorithms
Stars: ✭ 39 (-78.45%)
Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-87.85%)
GATE: Implementation of "Gated Attentive-Autoencoder for Content-Aware Recommendation"
Stars: ✭ 65 (-64.09%)
SANET: "Arbitrary Style Transfer with Style-Attentional Networks" (CVPR 2019)
Stars: ✭ 21 (-88.4%)
reasoning attention: Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (-81.22%)
G2P: Grapheme-to-phoneme conversion
Stars: ✭ 59 (-67.4%)
Sinet: Camouflaged object detection, CVPR 2020 (Oral; covered by New Scientist magazine)
Stars: ✭ 246 (+35.91%)
Attentionalpoolingaction: Code/model release for the NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (+37.02%)
Generative Inpainting Pytorch: A PyTorch reimplementation of "Generative Image Inpainting with Contextual Attention" (https://arxiv.org/abs/1801.07892)
Stars: ✭ 242 (+33.7%)
Generative inpainting: DeepFill v1/v2 with contextual attention and gated convolution (CVPR 2018 and ICCV 2019 Oral)
Stars: ✭ 2,659 (+1369.06%)