Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22 (-99.12%)
Linear Attention Recurrent Neural Network: A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop over the cell state, just like any other RNN; the windowed-attention idea is sketched below. (LARNN)
Stars: ✭ 119 (-95.25%)
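The mechanism described above is compact enough to sketch: the current hidden state queries a bounded window of past cell states through standard multi-head attention. The following is an illustrative PyTorch sketch, not the LARNN repository's actual API; the class name, window size, and head count are assumptions.

```python
import torch
import torch.nn as nn

class WindowedStateAttention(nn.Module):
    """Sketch: let the current hidden state query a window of past cell states."""
    def __init__(self, hidden_size=128, num_heads=4, window=16):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)

    def forward(self, h_t, past_cells):
        # h_t: (batch, hidden) current hidden state, used as the query
        # past_cells: list of (batch, hidden) cell states from earlier steps
        keys = torch.stack(past_cells[-self.window:], dim=1)  # (batch, <=window, hidden)
        query = h_t.unsqueeze(1)                              # (batch, 1, hidden)
        context, _ = self.attn(query, keys, keys)             # attend over the window
        return context.squeeze(1)                             # (batch, hidden)

# Usage: combine the returned context with the LSTM's candidate state each step.
cell = WindowedStateAttention()
past = [torch.randn(2, 128) for _ in range(30)]
ctx = cell(torch.randn(2, 128), past)  # (2, 128)
```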
Deepattention: Deep Visual Attention Prediction (TIP 2018).
Stars: ✭ 65 (-97.4%)
Nmt Keras: Neural Machine Translation with Keras.
Stars: ✭ 501 (-79.99%)
Attentionalpoolingaction: Code and model release for the NIPS 2017 paper "Attentional Pooling for Action Recognition".
Stars: ✭ 248 (-90.1%)
Image Caption Generator: A neural network that generates captions for an image using a CNN and an RNN with beam search (sketched below).
Stars: ✭ 126 (-94.97%)
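Beam search itself is repository-independent and short enough to sketch. Here step_fn is a hypothetical callable standing in for the decoder (not this repository's interface): given a partial token sequence, it returns log-probabilities over the next token.

```python
import torch

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=20):
    # Each beam is (token sequence, cumulative log-probability).
    beams = [([start_token], 0.0)]
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:
                candidates.append((seq, score))   # finished beams carry over unchanged
                continue
            log_probs = step_fn(seq)              # (vocab,) log-probs; assumed interface
            top = torch.topk(log_probs, beam_width)
            for lp, tok in zip(top.values.tolist(), top.indices.tolist()):
                candidates.append((seq + [tok], score + lp))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]
```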
Sockeye: A sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet.
Stars: ✭ 990 (-60.46%)
Prediction Flow: Deep-learning-based CTR models implemented in PyTorch.
Stars: ✭ 138 (-94.49%)
Bamnet: Code and data accompanying the NAACL 2019 paper "Bidirectional Attentive Memory Networks for Question Answering over Knowledge Bases".
Stars: ✭ 140 (-94.41%)
Pytorch Acnn Model: Code for "Relation Classification via Multi-Level Attention CNNs".
Stars: ✭ 170 (-93.21%)
Hnatt: Train and visualize Hierarchical Attention Networks.
Stars: ✭ 192 (-92.33%)
Adnet: Attention-guided CNN for image denoising (Neural Networks, 2020).
Stars: ✭ 135 (-94.61%)
Eeg Dl: A TensorFlow-based deep learning library for EEG signal classification tasks.
Stars: ✭ 165 (-93.41%)
Perceiver Pytorch: Implementation of Perceiver, "General Perception with Iterative Attention", in PyTorch; the core idea is sketched below.
Stars: ✭ 130 (-94.81%)
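The core Perceiver trick is that a small learned latent array cross-attends to an arbitrarily large input array, so compute scales with the latent size rather than the input size. A minimal sketch of that idea under assumed dimensions, omitting the paper's weight sharing and positional encodings:

```python
import torch
import torch.nn as nn

class PerceiverBlock(nn.Module):
    """Sketch: latents read from a large input via cross-attention, then self-attend."""
    def __init__(self, dim=64, num_latents=32, heads=4):
        super().__init__()
        self.latents = nn.Parameter(torch.randn(num_latents, dim))
        self.cross = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, inputs):
        # inputs: (batch, n, dim), where n may be very large (pixels, audio samples, ...)
        lat = self.latents.unsqueeze(0).expand(inputs.size(0), -1, -1)
        lat, _ = self.cross(lat, inputs, inputs)   # latents gather information from inputs
        lat, _ = self.self_attn(lat, lat, lat)     # latents process among themselves
        return lat
```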
Abstractive Summarization: Implementation of abstractive summarization using LSTMs in an encoder-decoder architecture with local attention.
Stars: ✭ 128 (-94.89%)
Gat: Graph Attention Networks (https://arxiv.org/abs/1710.10903); the attention computation is sketched below.
Stars: ✭ 2,229 (-10.98%)
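For reference, the attention-coefficient computation from the GAT paper fits in a few lines. This is a dense, single-head sketch for readability; real implementations (including this repo and Pygat below) use sparse edge lists and multiple heads.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    """Single-head graph attention layer over a dense adjacency matrix."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x, adj):
        # x: (N, in_dim) node features; adj: (N, N) adjacency with self-loops
        h = self.W(x)                                   # (N, out_dim)
        N = h.size(0)
        hi = h.unsqueeze(1).expand(N, N, -1)            # h_i broadcast over columns
        hj = h.unsqueeze(0).expand(N, N, -1)            # h_j broadcast over rows
        e = F.leaky_relu(self.a(torch.cat([hi, hj], dim=-1)).squeeze(-1), 0.2)
        e = e.masked_fill(adj == 0, float('-inf'))      # attend only along real edges
        alpha = torch.softmax(e, dim=-1)                # per-node attention over neighbours
        return alpha @ h                                # aggregated node features
```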
Yolov3 Point: A from-scratch YOLOv3 tutorial with annotated code and attention modules (SE, SPP, RFB, etc.).
Stars: ✭ 119 (-95.25%)
Graph attention pool: Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019).
Stars: ✭ 186 (-92.57%)
Picanet Implementation: PyTorch implementation of "PiCANet: Learning Pixel-wise Contextual Attention for Saliency Detection".
Stars: ✭ 157 (-93.73%)
Pygat: PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903).
Stars: ✭ 1,853 (-26%)
Attribute Aware Attention: [ACM MM 2018] Attribute-Aware Attention Model for Fine-grained Representation Learning.
Stars: ✭ 143 (-94.29%)
Document Classifier Lstm: A bidirectional LSTM with attention for multiclass/multilabel text classification.
Stars: ✭ 136 (-94.57%)
Invoicenet: Deep neural network to extract structured information from invoice documents.
Stars: ✭ 1,886 (-24.68%)
Lstm attention: Attention-based LSTM/Dense layers implemented in Keras.
Stars: ✭ 168 (-93.29%)
Slot Attention: Implementation of Slot Attention from Google AI; the iterative update is sketched below.
Stars: ✭ 168 (-93.29%)
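Slot Attention's distinctive step is that the softmax runs over the slots, so slots compete for input features, followed by a weighted mean over inputs and a GRU update. A minimal sketch under assumed dimensions; unlike the paper, the slots here are initialized deterministically from a learned mean rather than sampled from a learned Gaussian.

```python
import torch
import torch.nn as nn

class SlotAttention(nn.Module):
    def __init__(self, dim=64, num_slots=4, iters=3):
        super().__init__()
        self.num_slots, self.iters, self.scale = num_slots, iters, dim ** -0.5
        self.slots_mu = nn.Parameter(torch.randn(1, num_slots, dim))
        self.to_q, self.to_k, self.to_v = (nn.Linear(dim, dim) for _ in range(3))
        self.gru = nn.GRUCell(dim, dim)
        self.norm_in, self.norm_slots = nn.LayerNorm(dim), nn.LayerNorm(dim)

    def forward(self, inputs):
        # inputs: (batch, num_inputs, dim) flattened feature map
        b, d = inputs.size(0), inputs.size(-1)
        x = self.norm_in(inputs)
        k, v = self.to_k(x), self.to_v(x)
        slots = self.slots_mu.expand(b, -1, -1)
        for _ in range(self.iters):
            q = self.to_q(self.norm_slots(slots))
            attn = (q @ k.transpose(1, 2) * self.scale).softmax(dim=1)  # compete over slots
            attn = attn / (attn.sum(dim=-1, keepdim=True) + 1e-8)       # weighted mean
            updates = (attn @ v).reshape(-1, d)                         # (batch*slots, dim)
            slots = self.gru(updates, slots.reshape(-1, d)).reshape(b, self.num_slots, d)
        return slots
```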
Sa Tensorflow: Soft attention mechanism for video caption generation.
Stars: ✭ 154 (-93.85%)
Overlappredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-95.77%)
Absa keras: Keras implementation of aspect-based sentiment analysis.
Stars: ✭ 126 (-94.97%)
Attentive Gan Derainnet: Unofficial TensorFlow implementation of the "Attentive Generative Adversarial Network for Raindrop Removal from a Single Image" (CVPR 2018) model. https://maybeshewill-cv.github.io/attentive-gan-derainnet/
Stars: ✭ 184 (-92.65%)
Drln: Densely Residual Laplacian Super-resolution, IEEE Transactions on Pattern Analysis and Machine Intelligence (TPAMI), 2020.
Stars: ✭ 120 (-95.21%)
Guided Attention Inference Network: Implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository aims to apply GAIN to the FCN-8 architecture used for segmentation.
Stars: ✭ 204 (-91.85%)
Geoman: TensorFlow implementation of GeoMAN, IJCAI-18.
Stars: ✭ 113 (-95.49%)
Sinkhorn Transformer: Practical implementation of Sparse Sinkhorn Attention; the Sinkhorn normalization step is sketched below.
Stars: ✭ 156 (-93.77%)
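The key primitive here is Sinkhorn normalization: alternately normalizing the rows and columns of a score matrix (in log space, for numerical stability) until it is approximately doubly stochastic, yielding a differentiable soft permutation over key blocks. A minimal sketch:

```python
import torch

def sinkhorn(scores, iters=8):
    """Iterative row/column normalization in log space."""
    log_p = scores
    for _ in range(iters):
        log_p = log_p - torch.logsumexp(log_p, dim=-1, keepdim=True)  # normalize rows
        log_p = log_p - torch.logsumexp(log_p, dim=-2, keepdim=True)  # normalize columns
    return log_p.exp()

# Example: a soft permutation over 4 buckets; rows and columns both sum to ~1.
perm = sinkhorn(torch.randn(4, 4))
print(perm.sum(dim=0), perm.sum(dim=1))
```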
Text recognition toolbox: A uniform PyTorch reimplementation of a series of classical scene text recognition papers.
Stars: ✭ 114 (-95.45%)
Datastories Semeval2017 Task4: Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-92.65%)
Stanet: Official implementation of the spatial-temporal attention neural network (STANet) for remote sensing image change detection.
Stars: ✭ 109 (-95.65%)
Ylg: [CVPR 2020] Official implementation of "Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models".
Stars: ✭ 109 (-95.65%)
Lambda Networks: Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute; the content-lambda idea is sketched below.
Stars: ✭ 1,497 (-40.22%)
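The content half of a lambda layer is simple to sketch: keys and values are summarized into a single linear map (the "lambda") that is then applied to every query, so no quadratic attention map is ever materialized. The sketch below omits the positional lambdas and multi-query heads of the paper; names and dimensions are assumptions.

```python
import torch
import torch.nn as nn

class ContentLambda(nn.Module):
    """Content-only lambda layer: summarize the context, then apply it to each query."""
    def __init__(self, dim, dim_k=16):
        super().__init__()
        self.to_q = nn.Linear(dim, dim_k, bias=False)
        self.to_k = nn.Linear(dim, dim_k, bias=False)
        self.to_v = nn.Linear(dim, dim, bias=False)

    def forward(self, x):
        # x: (batch, n, dim)
        q = self.to_q(x)                      # (batch, n, dim_k)
        k = self.to_k(x).softmax(dim=1)       # keys normalized over positions
        v = self.to_v(x)                      # (batch, n, dim)
        lam = k.transpose(1, 2) @ v           # (batch, dim_k, dim): the content lambda
        return q @ lam                        # (batch, n, dim), linear in sequence length
```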
Csa Inpainting: Coherent Semantic Attention for image inpainting (ICCV 2019).
Stars: ✭ 202 (-91.93%)
Pan: Efficient Image Super-Resolution Using Pixel Attention (ECCV Workshops 2020); only 272K parameters.
Stars: ✭ 151 (-93.97%)
Reformer Pytorch: Reformer, the efficient Transformer, in PyTorch.
Stars: ✭ 1,644 (-34.35%)
Dhf1k: Revisiting Video Saliency: A Large-scale Benchmark and a New Model (CVPR 2018, PAMI 2019).
Stars: ✭ 96 (-96.17%)
Hart: Hierarchical Attentive Recurrent Tracking.
Stars: ✭ 149 (-94.05%)
Kac Net: TensorFlow implementation of "Knowledge Aided Consistency for Weakly Supervised Phrase Grounding".
Stars: ✭ 95 (-96.21%)
Eqtransformer: EQTransformer, a Python package for earthquake signal detection and phase picking using AI.
Stars: ✭ 95 (-96.21%)
Snli Entailment: Attention model for entailment on the SNLI corpus, implemented in TensorFlow and Keras.
Stars: ✭ 181 (-92.77%)