
6642 open-source projects that are alternatives to, or similar to, Graph_attention_pool

Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+120.97%)
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+388.17%)
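For context, the attention coefficients that GAT computes over a node's neighborhood can be sketched in a few lines of plain Python. This is an illustrative single-head sketch of the formula from the paper (e_ij = LeakyReLU(aᵀ[Wh_i ‖ Wh_j]), normalized with a softmax over neighbors); the function names and toy inputs are assumptions, not code from the repo:

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x > 0 else slope * x

def softmax(xs):
    m = max(xs)  # subtract max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def gat_coefficients(Wh, a, neighbors):
    """Single-head GAT attention coefficients (Veličković et al., 2018).

    Wh        : list of transformed node feature vectors (each length F')
    a         : attention vector of length 2*F'
    neighbors : dict mapping node i -> list of neighbor indices j

    Returns {i: {j: alpha_ij}} with alpha_ij = softmax_j(LeakyReLU(a^T [Wh_i || Wh_j])).
    """
    alphas = {}
    for i, nbrs in neighbors.items():
        e = []
        for j in nbrs:
            concat = Wh[i] + Wh[j]  # [Wh_i || Wh_j]
            e.append(leaky_relu(sum(ak * ck for ak, ck in zip(a, concat))))
        alphas[i] = dict(zip(nbrs, softmax(e)))
    return alphas
```

With identical neighbor features the coefficients come out uniform, which is a quick sanity check that the softmax normalizes over each neighborhood.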
Seq2seq Summarizer
Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (+64.52%)
Mutual labels:  attention-mechanism, attention
Show Attend And Tell
TensorFlow Implementation of "Show, Attend and Tell"
Stars: ✭ 869 (+367.2%)
Datastories Semeval2017 Task4
Deep-learning model presented in "DataStories at SemEval-2017 Task 4: Deep LSTM with Attention for Message-level and Topic-based Sentiment Analysis".
Stars: ✭ 184 (-1.08%)
Mutual labels:  attention-mechanism, attention
NTUA-slp-nlp
💻Speech and Natural Language Processing (SLP & NLP) Lab Assignments for ECE NTUA
Stars: ✭ 19 (-89.78%)
Mutual labels:  attention, attention-mechanism
Attention is all you need
Transformer of "Attention Is All You Need" (Vaswani et al. 2017) by Chainer.
Stars: ✭ 303 (+62.9%)
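The building block this paper introduced is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal pure-Python sketch of that formula (illustrative only; real implementations batch this with tensor libraries, and the names here are assumptions):

```python
import math

def softmax(xs):
    m = max(xs)  # subtract max for numerical stability
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017).

    Q, K, V are lists of row vectors; K and V must have the same number of rows.
    Returns one output row per query.
    """
    d_k = len(K[0])
    out = []
    for q in Q:
        # score each key against the query, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        w = softmax(scores)
        # weighted sum of value rows
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out
```

When all keys are identical the weights are uniform and the output is just the mean of the value rows, which makes the normalization easy to verify by hand.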
Deeplearning.ai Natural Language Processing Specialization
This repository contains my full work and notes on Coursera's Natural Language Processing Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai.
Stars: ✭ 473 (+154.3%)
Performer Pytorch
An implementation of Performer, a linear-attention-based transformer, in PyTorch
Stars: ✭ 546 (+193.55%)
Mutual labels:  attention-mechanism, attention
Attention Transfer
Improving Convolutional Networks via Attention Transfer (ICLR 2017)
Stars: ✭ 1,231 (+561.83%)
Mutual labels:  jupyter-notebook, attention
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can be easily used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-36.02%)
Absa keras
Keras Implementation of Aspect based Sentiment Analysis
Stars: ✭ 126 (-32.26%)
Mutual labels:  attention-mechanism, attention
automatic-personality-prediction
[AAAI 2020] Modeling Personality with Attentive Networks and Contextual Embeddings
Stars: ✭ 43 (-76.88%)
Mutual labels:  attention, attention-mechanism
ntua-slp-semeval2018
Deep-learning models of NTUA-SLP team submitted in SemEval 2018 tasks 1, 2 and 3.
Stars: ✭ 79 (-57.53%)
Mutual labels:  attention, attention-mechanism
Graph nn
Graph Classification with Graph Convolutional Networks in PyTorch (NeurIPS 2018 Workshop)
Stars: ✭ 268 (+44.09%)
Mutual labels:  graph, jupyter-notebook
Attention
Implementations of several different attention mechanisms
Stars: ✭ 17 (-90.86%)
Mutual labels:  attention, attention-mechanism
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+119.35%)
Mutual labels:  attention-mechanism, attention
Structured Self Attention
A Structured Self-attentive Sentence Embedding
Stars: ✭ 459 (+146.77%)
Mutual labels:  attention-mechanism, attention
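The embedding from this paper is an attention-weighted sum of the encoder's token states, with weights a = softmax(w_s2ᵀ tanh(W_s1 H ᵀ)). A single-hop pure-Python sketch of that computation (the full model stacks several such weight vectors into a matrix for multiple hops; the names and toy shapes here are assumptions, not the repo's code):

```python
import math

def matvec(M, v):
    """Multiply an r x d matrix (list of rows) by a length-d vector."""
    return [sum(mi * vi for mi, vi in zip(row, v)) for row in M]

def self_attentive_embedding(H, W1, w2):
    """One attention hop of the structured self-attentive embedding (Lin et al., 2017).

    H  : list of token hidden states (each length d)
    W1 : first projection matrix (r x d)
    w2 : second projection vector (length r)

    Computes a_t = softmax_t(w2^T tanh(W1 h_t)) and returns m = sum_t a_t * h_t.
    """
    scores = []
    for h in H:
        u = [math.tanh(x) for x in matvec(W1, h)]
        scores.append(sum(wi * ui for wi, ui in zip(w2, u)))
    mx = max(scores)  # stable softmax over token positions
    es = [math.exp(s - mx) for s in scores]
    Z = sum(es)
    a = [e / Z for e in es]
    d = len(H[0])
    return [sum(a[t] * H[t][j] for t in range(len(H))) for j in range(d)]
```

If every token state is identical, the attention weights are uniform and the embedding equals that shared state, a simple check that the weights sum to one.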
Adjusttext
A small library for automatically adjusting text positions in matplotlib plots to minimize overlaps.
Stars: ✭ 731 (+293.01%)
Mutual labels:  graph, jupyter-notebook
Keras Gat
Keras implementation of Graph Attention Networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
Stars: ✭ 334 (+79.57%)
Mutual labels:  graph, attention-mechanism
Group Level Emotion Recognition
Model submitted for the ICMI 2018 EmotiW Group-Level Emotion Recognition Challenge
Stars: ✭ 70 (-62.37%)
Nlp Tutorial
Natural Language Processing Tutorial for Deep Learning Researchers
Stars: ✭ 9,895 (+5219.89%)
Mutual labels:  jupyter-notebook, attention
Nlp Models Tensorflow
Gathers machine learning and TensorFlow deep learning models for NLP problems, 1.13 < TensorFlow < 2.0
Stars: ✭ 1,603 (+761.83%)
Mutual labels:  jupyter-notebook, attention
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-55.91%)
Mutual labels:  attention-mechanism, attention
Multihead Siamese Nets
Implementation of Siamese neural networks built upon a multi-head attention mechanism for the text semantic similarity task.
Stars: ✭ 144 (-22.58%)
Mutual labels:  jupyter-notebook, attention
Pytorch Question Answering
Important paper implementations for Question Answering using PyTorch
Stars: ✭ 154 (-17.2%)
Hey Jetson
Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Stars: ✭ 161 (-13.44%)
Mutual labels:  jupyter-notebook, attention
visualization
A collection of visualization functions
Stars: ✭ 189 (+1.61%)
Mutual labels:  attention, attention-mechanism
datastories-semeval2017-task6
Deep-learning model presented in "DataStories at SemEval-2017 Task 6: Siamese LSTM with Attention for Humorous Text Comparison".
Stars: ✭ 20 (-89.25%)
Mutual labels:  attention, attention-mechanism
AoA-pytorch
A Pytorch implementation of Attention on Attention module (both self and guided variants), for Visual Question Answering
Stars: ✭ 33 (-82.26%)
Mutual labels:  attention, attention-mechanism
Linear-Attention-Mechanism
Attention mechanism
Stars: ✭ 27 (-85.48%)
Mutual labels:  attention, attention-mechanism
Graph Based Deep Learning Literature
Links to conference publications in graph-based deep learning
Stars: ✭ 3,428 (+1743.01%)
Mutual labels:  graph, jupyter-notebook
Da Rnn
📃 **Unofficial** PyTorch Implementation of DA-RNN (arXiv:1704.02971)
Stars: ✭ 256 (+37.63%)
Adaptiveattention
Implementation of "Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning"
Stars: ✭ 303 (+62.9%)
Hierarchical-Word-Sense-Disambiguation-using-WordNet-Senses
Word Sense Disambiguation using Word Specific models, All word models and Hierarchical models in Tensorflow
Stars: ✭ 33 (-82.26%)
Mutual labels:  attention, attention-mechanism
Deep learning nlp
Keras, PyTorch, and NumPy Implementations of Deep Learning Architectures for NLP
Stars: ✭ 407 (+118.82%)
Mutual labels:  jupyter-notebook, attention
Action Recognition Visual Attention
Action recognition using soft attention based deep recurrent neural networks
Stars: ✭ 350 (+88.17%)
Ner Bert
BERT-NER (nert-bert) with Google BERT (https://github.com/google-research).
Stars: ✭ 339 (+82.26%)
Mutual labels:  jupyter-notebook, attention
Network Analysis Made Simple
An introduction to network analysis and applied graph theory using Python and NetworkX
Stars: ✭ 700 (+276.34%)
Mutual labels:  graph, jupyter-notebook
Attentionn
All about attention in neural networks. Soft attention, attention maps, local and global attention and multi-head attention.
Stars: ✭ 175 (-5.91%)
Mutual labels:  jupyter-notebook, attention
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-88.71%)
Mutual labels:  attention-mechanism, attention
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-69.35%)
Mutual labels:  attention, attention-mechanism
Deeplearning Nlp Models
A small, interpretable codebase containing the re-implementation of a few "deep" NLP models in PyTorch. Colab notebooks to run with GPUs. Models: word2vec, CNNs, transformer, gpt.
Stars: ✭ 64 (-65.59%)
Mutual labels:  jupyter-notebook, attention
Global Self Attention Network
A Pytorch implementation of Global Self-Attention Network, a fully-attention backbone for vision tasks
Stars: ✭ 64 (-65.59%)
Mutual labels:  attention-mechanism, attention
Machine Learning
My attempts in the world of ML/DL.
Stars: ✭ 78 (-58.06%)
Mutual labels:  jupyter-notebook, attention
Embedded gcnn
Embedded Graph Convolutional Neural Networks (EGCNN) in TensorFlow
Stars: ✭ 60 (-67.74%)
Mutual labels:  graph, jupyter-notebook
Bertqa Attention On Steroids
BertQA - Attention on Steroids
Stars: ✭ 112 (-39.78%)
Mutual labels:  jupyter-notebook, attention
Lambda Networks
Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute
Stars: ✭ 1,497 (+704.84%)
Mutual labels:  attention-mechanism, attention
Yolov3 Point
A from-scratch YOLOv3 tutorial with annotated code, plus attention modules (SE, SPP, RFB, etc.)
Stars: ✭ 119 (-36.02%)
Attentional Interfaces
🔍 Attentional interfaces in TensorFlow.
Stars: ✭ 58 (-68.82%)
Prediction Flow
Deep-learning-based CTR models implemented in PyTorch
Stars: ✭ 138 (-25.81%)
Mutual labels:  attention-mechanism, attention
Abstractive Summarization
Implementation of abstractive summarization using LSTM in the encoder-decoder architecture with local attention.
Stars: ✭ 128 (-31.18%)
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (-14.52%)
Image Caption Generator
A neural network to generate captions for an image using CNN and RNN with BEAM Search.
Stars: ✭ 126 (-32.26%)
Mutual labels:  attention-mechanism, attention
Im2LaTeX
An implementation of the Show, Attend and Tell paper in Tensorflow, for the OpenAI Im2LaTeX suggested problem
Stars: ✭ 16 (-91.4%)
Mutual labels:  attention, attention-mechanism
h-transformer-1d
Implementation of H-Transformer-1D, Hierarchical Attention for Sequence Learning
Stars: ✭ 121 (-34.95%)
Mutual labels:  attention, attention-mechanism
Attentive Neural Processes
Implementation of "Recurrent Attentive Neural Processes" to forecast power usage (with an LSTM baseline and MC Dropout)
Stars: ✭ 33 (-82.26%)
Mutual labels:  jupyter-notebook, attention
Chinese Chatbot
A Chinese chatbot trained on 100,000 dialogue pairs using an attention mechanism; it generates a meaningful reply to most everyday questions. The trained model is uploaded and can be run directly (and if you can't get it to run, I'll livestream myself eating my keyboard).
Stars: ✭ 124 (-33.33%)
Mutual labels:  jupyter-notebook, attention
Multimodal Sentiment Analysis
Attention-based multimodal fusion for sentiment analysis
Stars: ✭ 172 (-7.53%)
Mutual labels:  attention-mechanism, attention
Rnn For Joint Nlu
Pytorch implementation of "Attention-Based Recurrent Neural Network Models for Joint Intent Detection and Slot Filling" (https://arxiv.org/abs/1609.01454)
Stars: ✭ 176 (-5.38%)
Mutual labels:  jupyter-notebook, attention
1-60 of 6642 similar projects