
sooftware / Attentions

License: MIT
PyTorch implementation of some attentions for Deep Learning Researchers.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Attentions

Residual Attention Network
Residual Attention Network for Image Classification
Stars: ✭ 525 (+1246.15%)
Mutual labels:  attention
Nlp paper study
In-depth study of top-conference papers and reproduction of their code
Stars: ✭ 691 (+1671.79%)
Mutual labels:  attention
Isab Pytorch
An implementation of (Induced) Set Attention Block, from the Set Transformers paper
Stars: ✭ 21 (-46.15%)
Mutual labels:  attention
Speech Transformer
A PyTorch implementation of Speech Transformer, an End-to-End ASR with Transformer network on Mandarin Chinese.
Stars: ✭ 565 (+1348.72%)
Mutual labels:  attention
Awesome Fast Attention
list of efficient attention modules
Stars: ✭ 627 (+1507.69%)
Mutual labels:  attention
Spatial Transformer Network
A TensorFlow implementation of Spatial Transformer Networks.
Stars: ✭ 794 (+1935.9%)
Mutual labels:  attention
Chinesenre
Chinese entity relation extraction, PyTorch, BiLSTM + attention
Stars: ✭ 463 (+1087.18%)
Mutual labels:  attention
Attentive Neural Processes
Implementing "recurrent attentive neural processes" to forecast power usage (with LSTM baseline, MC Dropout)
Stars: ✭ 33 (-15.38%)
Mutual labels:  attention
Text Classification
Implementation of papers for text classification task on DBpedia
Stars: ✭ 682 (+1648.72%)
Mutual labels:  attention
Nlp tensorflow project
Uses TensorFlow to implement several NLP projects, e.g. classification, chatbot, NER, attention, QA, etc.
Stars: ✭ 27 (-30.77%)
Mutual labels:  attention
Attention Is All You Need Pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Stars: ✭ 6,070 (+15464.1%)
Mutual labels:  attention
Vad
Voice activity detection (VAD) toolkit including DNN, bDNN, LSTM and ACAM based VAD. We also provide our directly recorded dataset.
Stars: ✭ 622 (+1494.87%)
Mutual labels:  attention
Pytorch Gat
My implementation of the original GAT paper (Veličković et al.). I've additionally included the playground.py file for visualizing the Cora dataset, GAT embeddings, an attention mechanism, and entropy histograms. I've supported both Cora (transductive) and PPI (inductive) examples!
Stars: ✭ 908 (+2228.21%)
Mutual labels:  attention
Performer Pytorch
An implementation of Performer, a linear attention-based Transformer, in PyTorch
Stars: ✭ 546 (+1300%)
Mutual labels:  attention
Banglatranslator
Bangla Machine Translator
Stars: ✭ 21 (-46.15%)
Mutual labels:  attention
Punctuator2
A bidirectional recurrent neural network model with attention mechanism for restoring missing punctuation in unsegmented text
Stars: ✭ 483 (+1138.46%)
Mutual labels:  attention
Tf Rnn Attention
Tensorflow implementation of attention mechanism for text classification tasks.
Stars: ✭ 735 (+1784.62%)
Mutual labels:  attention
Attentioncluster
TensorFlow Implementation of "Attention Clusters: Purely Attention Based Local Feature Integration for Video Classification"
Stars: ✭ 33 (-15.38%)
Mutual labels:  attention
Defactonlp
DeFactoNLP: An Automated Fact-checking System that uses Named Entity Recognition, TF-IDF vector comparison and Decomposable Attention models.
Stars: ✭ 30 (-23.08%)
Mutual labels:  attention
Cell Detr
Official and maintained implementation of the paper Attention-Based Transformers for Instance Segmentation of Cells in Microstructures [BIBM 2020].
Stars: ✭ 26 (-33.33%)
Mutual labels:  attention

An Apache 2.0 PyTorch implementation of some attentions for Deep Learning Researchers.


Intro

attentions provides attention mechanisms used in natural language processing, implemented in PyTorch.
These attention mechanisms can be used in neural machine translation, speech recognition, image captioning, and more.


Attention allows the model to attend to different parts of the source sentence at each step of output generation.
Instead of encoding the input sequence into a single fixed context vector, the model learns how to generate a context vector for each output time step.
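To make this concrete, below is a minimal sketch of additive attention (Bahdanau et al., 2015) in PyTorch. The class name AdditiveAttention, its constructor arguments, and the tensor shapes are illustrative assumptions for this sketch, not necessarily this repository's exact API.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveAttention(nn.Module):
    # Additive attention (Bahdanau et al., 2015): scores each encoder state
    # against the current decoder state with a small feed-forward network,
    # then returns the attention-weighted context vector.
    # (Names and shapes are assumptions for this sketch.)
    def __init__(self, hidden_dim: int) -> None:
        super().__init__()
        self.query_proj = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.key_proj = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.score_proj = nn.Linear(hidden_dim, 1)

    def forward(self, query: torch.Tensor, keys: torch.Tensor):
        # query: (batch, 1, hidden_dim)       current decoder state
        # keys:  (batch, seq_len, hidden_dim) encoder states
        energy = torch.tanh(self.query_proj(query) + self.key_proj(keys))
        scores = self.score_proj(energy).squeeze(-1)      # (batch, seq_len)
        attn = F.softmax(scores, dim=-1)                  # weights over the source
        context = torch.bmm(attn.unsqueeze(1), keys)      # (batch, 1, hidden_dim)
        return context, attn

At each decoder step, the current hidden state is passed in as the query and a fresh context vector is produced, rather than reusing one fixed encoding of the whole input.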

Implementation list

Name                                            Citation
Additive Attention                              Bahdanau et al., 2015
Dot-Product Attention                           Luong et al., 2015
Location-Aware (Location Sensitive) Attention   Chorowski et al., 2015
Scaled Dot-Product Attention                    Vaswani et al., 2017
Multi-Head Attention                            Vaswani et al., 2017
Relative Multi-Head Self Attention              Zihang Dai et al., 2019
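For reference, the scaled dot-product variant from the table above computes Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. A minimal PyTorch sketch follows; the optional mask argument and returning the attention weights alongside the output are common conventions assumed here, not a guarantee of this repository's signature.

import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledDotProductAttention(nn.Module):
    # Scaled dot-product attention (Vaswani et al., 2017).
    def forward(self, query, key, value, mask=None):
        d_k = query.size(-1)
        # Similarity of every query to every key, scaled by sqrt(d_k)
        scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
        if mask is not None:
            # Block attention to masked (e.g. padded) positions
            scores = scores.masked_fill(mask == 0, float('-inf'))
        attn = F.softmax(scores, dim=-1)
        return torch.matmul(attn, value), attn

# Example: self-attention over a batch of 2 sequences of length 5
q = k = v = torch.randn(2, 5, 64)
context, attn = ScaledDotProductAttention()(q, k, v)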

Troubleshooting and Contributing

If you have any questions, bug reports, or feature requests, please open an issue on GitHub or contact [email protected].

I appreciate any kind of feedback or contribution. Feel free to proceed with small issues like bug fixes or documentation improvements. For major contributions and new features, please discuss them with the collaborators in the corresponding issues.

Code Style

I follow PEP 8 for code style. The style of docstrings is especially important, since docstrings are used to generate the documentation.
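For example, a documentation-friendly docstring might look like the sketch below; the section layout (Args/Returns) is an assumption of the kind of docstring meant, not a copy of the project's.

import torch

def dot_product_attention(query, value):
    """
    Compute dot-product attention between query and value.

    Args:
        query (torch.FloatTensor): decoder state, shape (batch, q_len, dim)
        value (torch.FloatTensor): encoder outputs, shape (batch, v_len, dim)

    Returns:
        context (torch.FloatTensor): attention-weighted context vector
        attn (torch.FloatTensor): attention distribution over value positions
    """
    score = torch.matmul(query, value.transpose(-2, -1))
    attn = torch.softmax(score, dim=-1)
    return torch.matmul(attn, value), attn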

Author

sooftware