ntua-slp-semeval2018: Deep-learning models of the NTUA-SLP team submitted to SemEval 2018 Tasks 1, 2 and 3.
Stars: ✭ 79 (-9.2%)
Mutual labels: attention, attention-mechanism, emotion-recognition
Global Self Attention Network: A PyTorch implementation of Global Self-Attention Network, a fully-attentional backbone for vision tasks.
Stars: ✭ 64 (-26.44%)
Mutual labels: attention, attention-mechanism
Hnatt: Train and visualize Hierarchical Attention Networks.
Stars: ✭ 192 (+120.69%)
Mutual labels: attention, attention-mechanism
Absa keras: A Keras implementation of aspect-based sentiment analysis.
Stars: ✭ 126 (+44.83%)
Mutual labels: attention, attention-mechanism
Guided Attention Inference Network: An implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository applies GAIN to the FCN-8 architecture used for segmentation.
Stars: ✭ 204 (+134.48%)
Mutual labels: attention, attention-mechanism
Pytorch Gat: My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+943.68%)
Mutual labels: attention, attention-mechanism
Lambda Networks: An implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute.
Stars: ✭ 1,497 (+1620.69%)
Mutual labels: attention, attention-mechanism
Neural sp: End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408 (+368.97%)
Mutual labels: attention, attention-mechanism
Prediction Flow: Deep-learning-based CTR models implemented in PyTorch.
Stars: ✭ 138 (+58.62%)
Mutual labels: attention, attention-mechanism
Self Attention Cv: Implementations of various self-attention mechanisms for computer vision. Ongoing repository.
Stars: ✭ 209 (+140.23%)
Mutual labels: attention, attention-mechanism
Multimodal Sentiment Analysis: Attention-based multimodal fusion for sentiment analysis.
Stars: ✭ 172 (+97.7%)
Mutual labels: attention, attention-mechanism
Performer Pytorch: An implementation of Performer, a linear-attention-based Transformer, in PyTorch.
Stars: ✭ 546 (+527.59%)
Mutual labels: attention, attention-mechanism
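For context, the core trick behind linear-attention models like Performer can be sketched in a few lines of NumPy. This is an illustrative sketch only (a simple positive feature map stands in for Performer's FAVOR+ random features; `linear_attention` and `phi` are hypothetical names, not the repository's API):

```python
import numpy as np

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0) + 1e-6):
    # Map queries and keys through a positive feature map phi.
    Qp, Kp = phi(Q), phi(K)            # (n, d) each
    # Associativity: compute Kp^T V once, giving O(n*d^2) instead of
    # the O(n^2*d) cost of materializing the full attention matrix.
    KV = Kp.T @ V                      # (d, d_v)
    norm = Qp @ Kp.sum(axis=0)         # (n,) per-row normalizer
    return (Qp @ KV) / norm[:, None]   # (n, d_v)

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((5, 4)) for _ in range(3))
out = linear_attention(Q, K, V)        # same result as explicit attention
```

Each output row is still a convex combination of the value rows; only the order of matrix multiplications changes, which is what makes attention linear in sequence length.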
Structured Self Attention: An implementation of "A Structured Self-attentive Sentence Embedding".
Stars: ✭ 459 (+427.59%)
Mutual labels: attention, attention-mechanism
Isab Pytorch: An implementation of the (Induced) Set Attention Block from the Set Transformer paper.
Stars: ✭ 21 (-75.86%)
Mutual labels: attention, attention-mechanism
Pytorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Pretrained IWSLT models are currently included.
Stars: ✭ 411 (+372.41%)
Mutual labels: attention, attention-mechanism
Attend infer repeat: A TensorFlow implementation of Attend, Infer, Repeat.
Stars: ✭ 82 (-5.75%)
Mutual labels: attention, attention-mechanism
Neat Vision: Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models on Natural Language Processing (NLP) tasks.
Stars: ✭ 213 (+144.83%)
Mutual labels: attention, attention-mechanism
Attention: Code for several different attention mechanisms.
Stars: ✭ 17 (-80.46%)
Mutual labels: attention, attention-mechanism
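The baseline that most of the attention variants listed here build on is scaled dot-product attention. A minimal NumPy sketch (illustrative only, not taken from any of the repositories above):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # (n_q, n_k) similarity scores
    weights = softmax(scores)         # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(1)
Q = rng.standard_normal((3, 4))       # 3 queries
K = rng.standard_normal((5, 4))       # 5 keys
V = rng.standard_normal((5, 2))       # 5 values
out, w = scaled_dot_product_attention(Q, K, V)   # out: (3, 2)
```

The `1/sqrt(d_k)` scaling keeps the dot products from growing with dimensionality, which would otherwise push the softmax into saturated, near-one-hot regions.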
Seq2seq Summarizer: Pointer-generator, reinforced seq2seq summarization in PyTorch.
Stars: ✭ 306 (+251.72%)
Mutual labels: attention, attention-mechanism
Image Caption Generator: A neural network that generates captions for an image using a CNN and an RNN with beam search.
Stars: ✭ 126 (+44.83%)
Mutual labels: attention, attention-mechanism
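The beam search used by captioning decoders like the one above can be sketched abstractly. In this toy sketch a hand-written next-token table stands in for the real CNN+RNN model, and all names (`beam_search`, `step_fn`) are illustrative:

```python
import math

def beam_search(step_fn, start, beam_width=3, max_len=5, eos="<eos>"):
    """Keep the beam_width highest log-prob partial sequences at each step.

    step_fn(seq) -> list of (token, prob) candidates for the next token.
    """
    beams = [([start], 0.0)]          # (sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == eos:        # finished beams carry over unchanged
                candidates.append((seq, score))
                continue
            for tok, p in step_fn(seq):
                candidates.append((seq + [tok], score + math.log(p)))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        if all(seq[-1] == eos for seq, _ in beams):
            break
    return beams[0]

# Toy scorer: "a" is the likely first word, then "cat", then end of sentence.
table = {
    "<s>": [("a", 0.9), ("the", 0.1)],
    "a":   [("cat", 0.6), ("dog", 0.4)],
    "the": [("cat", 0.5), ("dog", 0.5)],
    "cat": [("<eos>", 1.0)],
    "dog": [("<eos>", 1.0)],
}
best, score = beam_search(lambda seq: table[seq[-1]], "<s>")
# best == ["<s>", "a", "cat", "<eos>"]
```

Unlike greedy decoding, the beam keeps several hypotheses alive, so a locally lower-probability word can still win if it leads to a better overall caption.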