Ai Law: All kinds of baseline models for long text classification (text categorization)
Stars: ✭ 243 (+1178.95%)
Doc Han Att: Hierarchical Attention Networks for Chinese Sentiment Classification
Stars: ✭ 206 (+984.21%)
bert attn viz: Visualize BERT's self-attention layers on text classification tasks
Stars: ✭ 41 (+115.79%)
MGAN: Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (+131.58%)
DeepLearningReading: Deep Learning and Machine Learning mini-projects. Current project: DeepMind Attentive Reader (rc-data)
Stars: ✭ 78 (+310.53%)
Self Attention Cv: Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+1000%)
EBIM-NLI: Enhanced BiLSTM Inference Model for Natural Language Inference
Stars: ✭ 24 (+26.32%)
Self Attentive Tensorflow: TensorFlow implementation of "A Structured Self-Attentive Sentence Embedding"
Stars: ✭ 189 (+894.74%)
seq2seq-pytorch: Sequence-to-sequence models in PyTorch
Stars: ✭ 41 (+115.79%)
Astgcn: ⚠️ [Deprecated] No longer maintained; please use the code at https://github.com/guoshnBJTU/ASTGCN-r-pytorch instead
Stars: ✭ 246 (+1194.74%)
free-lunch-saliency: Code for "Free-Lunch Saliency via Attention in Atari Agents"
Stars: ✭ 15 (-21.05%)
reasoning attention: Unofficial implementations of attention models on the SNLI dataset
Stars: ✭ 34 (+78.95%)
Neat Vision: Neat (Neural Attention) Vision is a visualization tool for the attention mechanisms of deep-learning models for Natural Language Processing (NLP) tasks. (framework-agnostic)
Stars: ✭ 213 (+1021.05%)
Graphtransformer: Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to Graphs" (DLG-AAAI'21)
Stars: ✭ 187 (+884.21%)
keras-utility-layer-collection: Collection of custom layers and utility functions for Keras that are missing from the main framework
Stars: ✭ 63 (+231.58%)
Datastories Semeval2017 Task4: Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis"
Stars: ✭ 184 (+868.42%)
External-Attention-pytorch: 🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization, and convolution modules, helpful for understanding papers in depth ⭐⭐⭐
Stars: ✭ 7,344 (+38552.63%)
Attentionn: All about attention in neural networks: soft attention, attention maps, local and global attention, and multi-head attention
Stars: ✭ 175 (+821.05%)
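The mechanisms this entry lists share a common core: soft attention computes a softmax distribution over candidate positions and returns a weighted average of their values. A minimal, dependency-free sketch of single-query scaled dot-product attention (an illustration, not code from the repo above):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def soft_attention(query, keys, values):
    """Scaled dot-product (soft) attention for a single query vector.

    query: list[float]; keys, values: lists of equal-length list[float].
    Returns (context_vector, attention_weights).
    """
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(d).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    # Context vector: attention-weighted average of the values.
    context = [sum(w * v[i] for w, v in zip(weights, values))
               for i in range(len(values[0]))]
    return context, weights
```

Multi-head attention simply runs several such attentions in parallel on learned projections of the inputs and concatenates the resulting context vectors.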
dreyeve: [TPAMI 2018] Predicting the Driver's Focus of Attention: the DR(eye)VE Project. A deep neural network learnt to reproduce the human driver's focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (+363.16%)
Hey Jetson: Deep-learning-based automatic speech recognition with attention for the NVIDIA Jetson
Stars: ✭ 161 (+747.37%)
RecycleNet: Attentional Learning of Trash Classification
Stars: ✭ 23 (+21.05%)
lstm-attention: Attention-based bidirectional LSTM for classification tasks (ICASSP)
Stars: ✭ 87 (+357.89%)
DeepMove: Code for the WWW'18 paper "DeepMove: Predicting Human Mobility with Attentional Recurrent Networks"
Stars: ✭ 120 (+531.58%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText
Stars: ✭ 3,418 (+17889.47%)
Long Range Arena: Long Range Arena for benchmarking efficient Transformers
Stars: ✭ 235 (+1136.84%)
flow1d: [ICCV 2021 Oral] High-Resolution Optical Flow from 1D Attention and Correlation
Stars: ✭ 91 (+378.95%)
Appnp: A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019)
Stars: ✭ 234 (+1131.58%)
Gam: A PyTorch implementation of "Graph Classification Using Structural Attention" (KDD 2018)
Stars: ✭ 227 (+1094.74%)
natural-language-joint-query-search: Search photos on Unsplash based on OpenAI's CLIP model; supports search with joint image+text queries and attention visualization
Stars: ✭ 143 (+652.63%)
Pen Net For Inpainting: [CVPR 2019] PEN-Net: Learning Pyramid-Context Encoder Network for High-Quality Image Inpainting
Stars: ✭ 206 (+984.21%)
jeelizGlanceTracker: JavaScript/WebGL library that detects from the webcam video feed whether the user is looking at the screen. Lightweight and robust to all lighting conditions. Useful for playing/pausing videos depending on whether the user is watching, or for person detection. Link to live demo.
Stars: ✭ 68 (+257.89%)
Guided Attention Inference Network: Implementation of the Guided Attention Inference Network (GAIN) presented in "Tell Me Where to Look" (CVPR 2018). This repository aims to apply GAIN to the FCN-8 architecture used for segmentation.
Stars: ✭ 204 (+973.68%)
AiR: Official repository for the ECCV 2020 paper "AiR: Attention with Reasoning Capability"
Stars: ✭ 41 (+115.79%)
Hnatt: Train and visualize Hierarchical Attention Networks
Stars: ✭ 192 (+910.53%)
h-transformer-1d: Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (+536.84%)
Graph attention pool: Attention over nodes in Graph Neural Networks using PyTorch (NeurIPS 2019)
Stars: ✭ 186 (+878.95%)
tensorflow-chatbot-chinese: Web chatbot | TensorFlow implementation of a seq2seq model with Bahdanau attention and pretrained Word2Vec embeddings
Stars: ✭ 50 (+163.16%)
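The Bahdanau (additive) attention used by the entry above scores each encoder state against the decoder hidden state with a small MLP, then forms the context as the softmax-weighted average of the encoder states. A minimal illustrative sketch (the weight shapes and names here are hypothetical, not from the repo):

```python
import math

def _matvec(M, x):
    # Plain matrix-vector product on nested lists.
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def additive_score(hidden, enc_state, W1, W2, v):
    # Bahdanau score: v . tanh(W1 @ hidden + W2 @ enc_state)
    pre = [a + b for a, b in zip(_matvec(W1, hidden), _matvec(W2, enc_state))]
    return sum(vi * math.tanh(p) for vi, p in zip(v, pre))

def bahdanau_attention(hidden, enc_states, W1, W2, v):
    """Additive attention over encoder states for one decoder step.

    Returns (context_vector, attention_weights).
    """
    scores = [additive_score(hidden, s, W1, W2, v) for s in enc_states]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    context = [sum(w * s[i] for w, s in zip(weights, enc_states))
               for i in range(len(enc_states[0]))]
    return context, weights
```

Unlike dot-product attention, the learned MLP lets the query and key vectors have different dimensions, which is why this form is common in seq2seq decoders.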
chatbot: A Chinese chatbot based on deep learning, with detailed tutorials and code; every file is thoroughly commented, making it a great choice for learning. (A Chinese chatbot based on deep learning.)
Stars: ✭ 94 (+394.74%)
Rnn For Joint Nlu: PyTorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (+826.32%)
Im2LaTeX: An implementation of the Show, Attend and Tell paper in TensorFlow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-15.79%)
Transformers.jl: Julia implementation of Transformer models
Stars: ✭ 173 (+810.53%)
TRAR-VQA: [ICCV 2021] TRAR: Routing the Attention Spans in Transformers for Visual Question Answering (official implementation)
Stars: ✭ 49 (+157.89%)
attention-ocr: A PyTorch implementation of attention-based OCR
Stars: ✭ 44 (+131.58%)
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57 (+200%)
AttnSleep: [IEEE TNSRE] "An Attention-based Deep Learning Approach for Sleep Stage Classification with Single-Channel EEG"
Stars: ✭ 76 (+300%)
how attentive are gats: Code for the paper "How Attentive are Graph Attention Networks?" (ICLR 2022)
Stars: ✭ 200 (+952.63%)