johnsmithm / multi-heads-attention-image-classification

Licence: other
Multi-head attention for image classification

Programming Languages

Python

Projects that are alternatives to or similar to multi-heads-attention-image-classification

Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (+308.33%)
Mutual labels:  attention-is-all-you-need
Attention Is All You Need Keras
A Keras+TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 628 (+772.22%)
Mutual labels:  attention-is-all-you-need
Njunmt Pytorch
Stars: ✭ 79 (+9.72%)
Mutual labels:  attention-is-all-you-need
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. IWSLT pretrained models are currently included.
Stars: ✭ 411 (+470.83%)
Mutual labels:  attention-is-all-you-need
Attention Is All You Need Pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
Stars: ✭ 6,070 (+8330.56%)
Mutual labels:  attention-is-all-you-need
Witwicky
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-70.83%)
Mutual labels:  attention-is-all-you-need
BangalASR
Transformer-based Bangla speech recognition
Stars: ✭ 20 (-72.22%)
Mutual labels:  attention-is-all-you-need
Pytorch Transformer
A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 199 (+176.39%)
Mutual labels:  attention-is-all-you-need
Awesome Fast Attention
A list of efficient attention modules
Stars: ✭ 627 (+770.83%)
Mutual labels:  attention-is-all-you-need
Transformers without tears
Transformers without Tears: Improving the Normalization of Self-Attention
Stars: ✭ 80 (+11.11%)
Mutual labels:  attention-is-all-you-need
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+595.83%)
Mutual labels:  attention-is-all-you-need
Speech Transformer
A PyTorch implementation of Speech Transformer, an end-to-end ASR system built on the Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (+684.72%)
Mutual labels:  attention-is-all-you-need
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+1275%)
Mutual labels:  attention-is-all-you-need
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+4963.89%)
Mutual labels:  attention-is-all-you-need
Linear Attention Recurrent Neural Network
A recurrent attention module (LARNN) consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN.
Stars: ✭ 119 (+65.28%)
Mutual labels:  attention-is-all-you-need
Transformer
A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (+276.39%)
Mutual labels:  attention-is-all-you-need
Bert language understanding
Pre-training of Deep Bidirectional Transformers for Language Understanding: pre-train TextCNN
Stars: ✭ 933 (+1195.83%)
Mutual labels:  attention-is-all-you-need
pytorch-transformer
A PyTorch implementation of the Transformer model from "Attention Is All You Need".
Stars: ✭ 49 (-31.94%)
Mutual labels:  attention-is-all-you-need
Kospeech
Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (+163.89%)
Mutual labels:  attention-is-all-you-need
Machine Translation
Stars: ✭ 51 (-29.17%)
Mutual labels:  attention-is-all-you-need

Attention Is All You Need: A Keras Implementation

Using attention to increase image classification accuracy. Inspired by "Attention Is All You Need" (Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, Illia Polosukhin, arXiv, 2017).
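
The core mechanism borrowed from the paper is scaled dot-product attention, Attention(Q, K, V) = softmax(Q Kᵀ / √d_k) V. Below is a minimal NumPy sketch of that formula; the function name and example shapes are illustrative, not the repository's code.

import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V (Vaswani et al., 2017)."""
    d_k = q.shape[-1]
    # Similarity of every query to every key, scaled to keep softmax gradients stable.
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    # Numerically stable softmax over the key axis.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output token is a weighted average of the value vectors.
    return weights @ v

# Example: a batch of 1 sequence with 4 tokens of dimension 8 (self-attention).
q = k = v = np.random.randn(1, 4, 8)
out = scaled_dot_product_attention(q, k, v)  # shape (1, 4, 8)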

The medium article can be found here.
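
For classification, a multi-head attention block can be dropped into a small Keras model. The sketch below is only a guess at the overall shape, using tf.keras's built-in MultiHeadAttention layer and treating each row of a 28x28 image as one token; the head count, dimensions, and pooling choice are assumptions, not the repository's exact architecture.

import tensorflow as tf
from tensorflow.keras import layers, models

NUM_HEADS = 4      # assumed head count
MODEL_DIM = 64     # assumed embedding size
NUM_CLASSES = 10   # e.g. MNIST digits

inputs = layers.Input(shape=(28, 28))      # each image row becomes one token
tokens = layers.Dense(MODEL_DIM)(inputs)   # embed rows into MODEL_DIM features
# Multi-head self-attention: queries, keys, and values all come from `tokens`.
attended = layers.MultiHeadAttention(
    num_heads=NUM_HEADS, key_dim=MODEL_DIM // NUM_HEADS)(tokens, tokens)
x = layers.Add()([tokens, attended])       # residual connection, as in the paper
x = layers.LayerNormalization()(x)
pooled = layers.GlobalAveragePooling1D()(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(pooled)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

Training is then the usual model.fit(x_train, y_train, ...) call on an image dataset such as MNIST.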
