BA-Transformer: [MICCAI 2021] Boundary-aware Transformers for Skin Lesion Segmentation
Stars: ✭ 86 (-85.9%)
SINet: Camouflaged Object Detection, CVPR 2020 (Oral; reported by New Scientist magazine)
Stars: ✭ 246 (-59.67%)
Attentional-pooling-action: Code/model release for the NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (-59.34%)
Generative Inpainting Pytorch: A PyTorch reimplementation of the paper "Generative Image Inpainting with Contextual Attention" (https://arxiv.org/abs/1801.07892)
Stars: ✭ 242 (-60.33%)
Generative Inpainting: DeepFill v1/v2 with Contextual Attention and Gated Convolution (CVPR 2018 and ICCV 2019 Oral)
Stars: ✭ 2,659 (+335.9%)
SNLI Entailment: Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras
Stars: ✭ 181 (-70.33%)
Pytorch ACNN Model: Code for "Relation Classification via Multi-Level Attention CNNs"
Stars: ✭ 170 (-72.13%)
SA-TensorFlow: Soft attention mechanism for video caption generation
Stars: ✭ 154 (-74.75%)
BAMnet: Code and data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases"
Stars: ✭ 140 (-77.05%)
Image Caption Generator: A neural network that generates captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (-79.34%)
Linear Attention Recurrent Neural Network (LARNN): A recurrent attention module consisting of an LSTM cell that can query its own past cell states via windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop over the cell state, like any other RNN.
Stars: ✭ 119 (-80.49%)
Attention Gated Networks: Attention gates in a convolutional neural network for medical image classification and segmentation
Stars: ✭ 1,237 (+102.79%)
Code: ECG classification
Stars: ✭ 78 (-87.21%)
Deepattention: Deep Visual Attention Prediction (TIP 2018)
Stars: ✭ 65 (-89.34%)
Sockeye: A sequence-to-sequence framework focused on neural machine translation, built on Apache MXNet
Stars: ✭ 990 (+62.3%)
NMT-Keras: Neural machine translation with Keras
Stars: ✭ 501 (-17.87%)
MTAN: The implementation of "End-to-End Multi-Task Learning with Attention" (CVPR 2019)
Stars: ✭ 364 (-40.33%)
AttentionGAN: AttentionGAN for unpaired image-to-image translation and multi-domain image-to-image translation
Stars: ✭ 341 (-44.1%)
attention-ocr.pytorch: This repository implements an encoder-decoder model with attention for OCR
Stars: ✭ 278 (-54.43%)
SAE-NAD: The implementation of "Point-of-Interest Recommendation: Exploiting Self-Attentive Autoencoders with Neighbor-Aware Influence"
Stars: ✭ 48 (-92.13%)
Caver: A toolkit for multilabel text classification.
Stars: ✭ 38 (-93.77%)
PBAN-PyTorch: PyTorch implementation of "A Position-aware Bidirectional Attention Network for Aspect-level Sentiment Analysis"
Stars: ✭ 33 (-94.59%)
Attention: Repository for attention algorithms
Stars: ✭ 39 (-93.61%)
Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD)
Stars: ✭ 22 (-96.39%)
GATE: The implementation of "Gated Attentive-Autoencoder for Content-Aware Recommendation"
Stars: ✭ 65 (-89.34%)
SANET: "Arbitrary Style Transfer with Style-Attentional Networks" (CVPR 2019)
Stars: ✭ 21 (-96.56%)
reasoning-attention: Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (-94.43%)
G2P: Grapheme-to-phoneme conversion
Stars: ✭ 59 (-90.33%)
SIGIR2021 Conure: "One Person, One Model, One World: Learning Continual User Representation without Forgetting" (SIGIR 2021)
Stars: ✭ 23 (-96.23%)
pytorch-psetae: PyTorch implementation of the model presented in "Satellite Image Time Series Classification with Pixel-Set Encoders and Temporal Self-Attention"
Stars: ✭ 117 (-80.82%)
SeqFormer: A Frustratingly Simple Model for Video Instance Segmentation
Stars: ✭ 230 (-62.3%)
SReT: Official PyTorch implementation of the ECCV 2022 paper "Sliced Recursive Transformer"
Stars: ✭ 51 (-91.64%)
ChangeFormer: Official PyTorch implementation of the IGARSS 2022 paper "A Transformer-Based Siamese Network for Change Detection"
Stars: ✭ 220 (-63.93%)
pytorch-transformer-kor-eng: Transformer implementation in PyTorch for neural machine translation (Korean to English)
Stars: ✭ 40 (-93.44%)
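Nearly every repository in this list builds on the same core operation: scaled dot-product attention. As an illustration only (not the code of any particular repo above), a minimal NumPy sketch, assuming a single head with no masking or learned projections:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """q: (n_q, d), k: (n_k, d), v: (n_k, d_v) -> (n_q, d_v)."""
    d = q.shape[-1]
    # Similarity of each query to each key, scaled by sqrt(d).
    scores = q @ k.T / np.sqrt(d)
    # Numerically stable softmax over the key dimension.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output is the attention-weighted sum of the values.
    return weights @ v

# Toy example: one query attending over two keys/values.
q = np.array([[1.0, 0.0]])
k = np.array([[1.0, 0.0], [0.0, 1.0]])
v = np.array([[10.0], [20.0]])
out = scaled_dot_product_attention(q, k, v)
```

The query matches the first key more strongly, so the output lands between the two values but closer to 10. The full models above extend this with masks, multiple heads, and learned Q/K/V projections.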