
49 open-source projects that are alternatives to or similar to swin-transformer-pytorch

BA-Transformer
[MICCAI 2021] Boundary-aware Transformers for Skin Lesion Segmentation
Stars: ✭ 86 (-85.9%)
Mutual labels:  transformer-architecture
Sinet
Camouflaged Object Detection, CVPR 2020 (Oral; covered by New Scientist magazine)
Stars: ✭ 246 (-59.67%)
Mutual labels:  attention-model
Attentionalpoolingaction
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Stars: ✭ 248 (-59.34%)
Mutual labels:  attention-model
Generative Inpainting Pytorch
A PyTorch reimplementation of the paper "Generative Image Inpainting with Contextual Attention" (https://arxiv.org/abs/1801.07892)
Stars: ✭ 242 (-60.33%)
Mutual labels:  attention-model
Pytorch Batch Attention Seq2seq
PyTorch implementation of a batched bi-RNN encoder and attention decoder.
Stars: ✭ 245 (-59.84%)
Mutual labels:  attention-model
Generative inpainting
DeepFill v1/v2 with Contextual Attention and Gated Convolution, CVPR 2018, and ICCV 2019 Oral
Stars: ✭ 2,659 (+335.9%)
Mutual labels:  attention-model
Keras Attention Mechanism
Attention mechanism implementation for Keras.
Stars: ✭ 2,504 (+310.49%)
Mutual labels:  attention-model
Speech emotion recognition blstm
Bidirectional LSTM network for speech emotion recognition.
Stars: ✭ 203 (-66.72%)
Mutual labels:  attention-model
Snli Entailment
Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras
Stars: ✭ 181 (-70.33%)
Mutual labels:  attention-model
Pytorch Acnn Model
Code for "Relation Classification via Multi-Level Attention CNNs"
Stars: ✭ 170 (-72.13%)
Mutual labels:  attention-model
Sa Tensorflow
Soft attention mechanism for video caption generation
Stars: ✭ 154 (-74.75%)
Mutual labels:  attention-model
Bamnet
Code & data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases"
Stars: ✭ 140 (-77.05%)
Mutual labels:  attention-model
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-79.34%)
Mutual labels:  attention-model
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can be used inside a loop on the cell state, just like any other RNN cell (LARNN). A minimal illustrative sketch follows this entry.
Stars: ✭ 119 (-80.49%)
Mutual labels:  attention-model
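
As a rough illustration of the idea described in the entry above, here is a minimal PyTorch sketch of an LSTM cell that attends over a window of its own past cell states with multi-head attention. The class and parameter names (WindowedAttentionLSTMCell, window_size) are illustrative assumptions, not the LARNN repository's actual API or exact formulation.

```python
# Hypothetical sketch of a windowed-attention recurrent cell in the spirit of LARNN.
# Not the repository's implementation; names and wiring are illustrative only.
import torch
import torch.nn as nn

class WindowedAttentionLSTMCell(nn.Module):
    def __init__(self, input_size, hidden_size, num_heads=4, window_size=8):
        super().__init__()
        self.window_size = window_size
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        # The LSTM cell consumes the input concatenated with the attention readout.
        self.cell = nn.LSTMCell(input_size + hidden_size, hidden_size)

    def forward(self, x, state, past_cells):
        h, c = state
        # Attend from the current hidden state over a window of past cell states.
        window = torch.stack(past_cells[-self.window_size:], dim=1)  # (B, W, H)
        query = h.unsqueeze(1)                                       # (B, 1, H)
        readout, _ = self.attn(query, window, window)                # (B, 1, H)
        h, c = self.cell(torch.cat([x, readout.squeeze(1)], dim=-1), (h, c))
        return h, c

# Usage: loop over time steps like any other RNN cell, keeping past cell states.
cell = WindowedAttentionLSTMCell(input_size=32, hidden_size=64)
x = torch.randn(5, 10, 32)                     # (batch, time, features)
h = torch.zeros(5, 64); c = torch.zeros(5, 64)
past = [c]
for t in range(x.size(1)):
    h, c = cell(x[:, t], (h, c), past)
    past.append(c)
```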
Transformer image caption
Image Captioning based on Bottom-Up and Top-Down Attention model
Stars: ✭ 94 (-84.59%)
Mutual labels:  attention-model
Attention Gated Networks
Attention gates in a convolutional neural network for medical image classification and segmentation
Stars: ✭ 1,237 (+102.79%)
Mutual labels:  attention-model
Code
ECG Classification
Stars: ✭ 78 (-87.21%)
Mutual labels:  attention-model
Pytorch Attention Guided Cyclegan
PyTorch implementation of Unsupervised Attention-guided Image-to-Image Translation.
Stars: ✭ 67 (-89.02%)
Mutual labels:  attention-model
Deepattention
Deep Visual Attention Prediction (TIP18)
Stars: ✭ 65 (-89.34%)
Mutual labels:  attention-model
Awesome Attention Mechanism In Cv
A PyTorch implementation collection of attention modules and other plug-and-play modules used in computer vision
Stars: ✭ 54 (-91.15%)
Mutual labels:  attention-model
Sockeye
Sequence-to-sequence framework with a focus on neural machine translation, based on Apache MXNet
Stars: ✭ 990 (+62.3%)
Mutual labels:  attention-model
Reading comprehension tf
Machine reading comprehension in TensorFlow
Stars: ✭ 37 (-93.93%)
Mutual labels:  attention-model
Text Classification Pytorch
Text classification using deep learning models in PyTorch
Stars: ✭ 683 (+11.97%)
Mutual labels:  attention-model
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (-17.87%)
Mutual labels:  attention-model
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (-24.75%)
Mutual labels:  attention-model
Attention Ocr Chinese Version
Attention OCR based on TensorFlow
Stars: ✭ 421 (-30.98%)
Mutual labels:  attention-model
Mtan
The implementation of "End-to-End Multi-Task Learning with Attention" [CVPR 2019].
Stars: ✭ 364 (-40.33%)
Mutual labels:  attention-model
Attentiongan
AttentionGAN for Unpaired Image-to-Image Translation & Multi-Domain Image-to-Image Translation
Stars: ✭ 341 (-44.1%)
Mutual labels:  attention-model
Attention ocr.pytorch
This repository implements an encoder-decoder model with attention for OCR
Stars: ✭ 278 (-54.43%)
Mutual labels:  attention-model
SAE-NAD
The implementation of "Point-of-Interest Recommendation: Exploiting Self-Attentive Autoencoders with Neighbor-Aware Influence"
Stars: ✭ 48 (-92.13%)
Mutual labels:  attention-model
Caver
Caver: a toolkit for multilabel text classification.
Stars: ✭ 38 (-93.77%)
Mutual labels:  attention-model
attention-mechanism-keras
Attention mechanism in Keras, usable like Dense and RNN layers
Stars: ✭ 19 (-96.89%)
Mutual labels:  attention-model
PBAN-PyTorch
PyTorch implementation of "A Position-aware Bidirectional Attention Network for Aspect-level Sentiment Analysis".
Stars: ✭ 33 (-94.59%)
Mutual labels:  attention-model
Attention
Repository for Attention Algorithm
Stars: ✭ 39 (-93.61%)
Mutual labels:  attention-model
Recognizing-Textual-Entailment
A PyTorch implementation of models for recognizing textual entailment on the SNLI corpus
Stars: ✭ 31 (-94.92%)
Mutual labels:  attention-model
HHH-An-Online-Question-Answering-System-for-Medical-Questions
HBAM: Hierarchical Bi-directional Word Attention Model
Stars: ✭ 44 (-92.79%)
Mutual labels:  attention-model
Compact-Global-Descriptor
PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-96.39%)
Mutual labels:  attention-model
learningspoons
NLP lecture notes and source code
Stars: ✭ 29 (-95.25%)
Mutual labels:  attention-model
GATE
The implementation of "Gated Attentive-Autoencoder for Content-Aware Recommendation"
Stars: ✭ 65 (-89.34%)
Mutual labels:  attention-model
SANET
"Arbitrary Style Transfer with Style-Attentional Networks" (CVPR 2019)
Stars: ✭ 21 (-96.56%)
Mutual labels:  attention-model
reasoning attention
Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (-94.43%)
Mutual labels:  attention-model
G2P
Grapheme-to-phoneme conversion
Stars: ✭ 59 (-90.33%)
Mutual labels:  attention-model
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-96.23%)
Mutual labels:  transformer-architecture
pytorch-psetae
PyTorch implementation of the model presented in "Satellite Image Time Series Classification with Pixel-Set Encoders and Temporal Self-Attention"
Stars: ✭ 117 (-80.82%)
Mutual labels:  transformer-architecture
SeqFormer
SeqFormer: a Frustratingly Simple Model for Video Instance Segmentation
Stars: ✭ 230 (-62.3%)
Mutual labels:  transformer-architecture
SReT
Official PyTorch implementation of our ECCV 2022 paper "Sliced Recursive Transformer"
Stars: ✭ 51 (-91.64%)
Mutual labels:  transformer-architecture
mmwave-gesture-recognition
Basic gesture recognition using a TI AWR1642 mmWave sensor
Stars: ✭ 32 (-94.75%)
Mutual labels:  transformer-architecture
ChangeFormer
Official PyTorch implementation of our IGARSS'22 paper: A Transformer-Based Siamese Network for Change Detection
Stars: ✭ 220 (-63.93%)
Mutual labels:  transformer-architecture
pytorch-transformer-kor-eng
Transformer Implementation using PyTorch for Neural Machine Translation (Korean to English)
Stars: ✭ 40 (-93.44%)
Mutual labels:  transformer-pytorch