
jayparks / Transformer

A PyTorch Implementation of "Attention is All You Need" and "Weighted Transformer Network for Machine Translation"

Programming Languages

Python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to Transformer

Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+84.87%)
Mutual labels:  attention-mechanism, machine-translation, attention-is-all-you-need
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+265.31%)
Mutual labels:  attention-mechanism, machine-translation, attention-is-all-you-need
Machine Translation
Stars: ✭ 51 (-81.18%)
Mutual labels:  machine-translation, attention-is-all-you-need
Witwicky
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-92.25%)
Mutual labels:  machine-translation, attention-is-all-you-need
Transformer
A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+1245.39%)
Mutual labels:  attention-mechanism, attention-is-all-you-need
Pytorch Original Transformer
My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Pretrained IWSLT models are currently included.
Stars: ✭ 411 (+51.66%)
Mutual labels:  attention-mechanism, attention-is-all-you-need
SequenceToSequence
A seq2seq-with-attention dialogue/machine-translation model implemented in TensorFlow.
Stars: ✭ 11 (-95.94%)
Mutual labels:  machine-translation, attention-mechanism
Transformers without tears
Transformers without Tears: Improving the Normalization of Self-Attention
Stars: ✭ 80 (-70.48%)
Mutual labels:  machine-translation, attention-is-all-you-need
Linear Attention Recurrent Neural Network
A recurrent attention module consisting of an LSTM cell which can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer Network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-56.09%)
Mutual labels:  attention-mechanism, attention-is-all-you-need
Attention Mechanisms
Implementations for a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (-25.09%)
Mutual labels:  attention-mechanism, machine-translation
Machine-Translation-Hindi-to-english-
Machine translation is the task of converting text from one language to another. Unlike traditional phrase-based translation systems, which consist of many small sub-components that are tuned separately, neural machine translation attempts to build and train a single, large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19 (-92.99%)
Mutual labels:  machine-translation, attention-mechanism
linformer
Implementation of Linformer for PyTorch
Stars: ✭ 119 (-56.09%)
Mutual labels:  attention-mechanism
Attention-Visualization
Visualization for simple attention and Google's multi-head attention.
Stars: ✭ 54 (-80.07%)
Mutual labels:  machine-translation
pynmt
A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-95.2%)
Mutual labels:  attention-mechanism
transformer-pytorch
A PyTorch implementation of Transformer in "Attention is All You Need"
Stars: ✭ 77 (-71.59%)
Mutual labels:  machine-translation
Writing-editing-Network
Code for the paper "Paper Abstract Writing through Editing Mechanism"
Stars: ✭ 72 (-73.43%)
Mutual labels:  attention-mechanism
transganformer
Implementation of TransGanFormer, an all-attention GAN that combines the findings from the recent GanFormer and TransGan papers
Stars: ✭ 137 (-49.45%)
Mutual labels:  attention-mechanism
Attention
Code for several different attention mechanisms
Stars: ✭ 17 (-93.73%)
Mutual labels:  attention-mechanism
co-attention
PyTorch implementation of "Dynamic Coattention Networks For Question Answering"
Stars: ✭ 54 (-80.07%)
Mutual labels:  attention-mechanism
keras attention
🔖 An Attention Layer in Keras
Stars: ✭ 43 (-84.13%)
Mutual labels:  attention-mechanism

A PyTorch Implementation of the Transformer Network

This repository includes PyTorch implementations of "Attention is All You Need" (Vaswani et al., NIPS 2017) and "Weighted Transformer Network for Machine Translation" (Ahmed et al., arXiv 2017).
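
As background, both papers build on scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)·V. Below is a minimal PyTorch sketch of that operation (illustrative only: the tensor layout and the boolean-mask convention are assumptions, not this repository's actual API):

```python
import math

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V.

    Illustrative sketch: assumes q, k, v of shape
    (batch, heads, seq_len, d_k) and an optional boolean mask
    broadcastable to the score matrix (True = attend).
    """
    d_k = q.size(-1)
    scores = torch.matmul(q, k.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(~mask, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    return torch.matmul(weights, v), weights
```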

Reference

Paper

  • Vaswani et al., "Attention is All You Need", NIPS 2017
  • Ahmed et al., "Weighted Transformer Network for Machine Translation", arXiv 2017
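
The Weighted Transformer's distinguishing idea is to replace the plain concatenation of attention heads with learned, softmax-normalized branch weights: each branch output is scaled by a weight κ_i, passed through the position-wise feed-forward sublayer, and the results are summed with weights α_i. A rough sketch of that combination step (the module name is hypothetical, and a single FFN is shared across branches for brevity; the paper's exact parameterization differs in detail):

```python
import torch
import torch.nn as nn

class WeightedBranchCombine(nn.Module):
    # Hypothetical module sketching the kappa/alpha branch weighting of
    # Ahmed et al. (2017); not one of this repository's actual classes.
    def __init__(self, num_branches: int, d_model: int, d_ff: int):
        super().__init__()
        self.kappa = nn.Parameter(torch.ones(num_branches))  # "concat" weights
        self.alpha = nn.Parameter(torch.ones(num_branches))  # "addition" weights
        self.ffn = nn.Sequential(  # one shared position-wise FFN for brevity
            nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model)
        )

    def forward(self, branches: torch.Tensor) -> torch.Tensor:
        # branches: (num_branches, batch, seq_len, d_model)
        kappa = torch.softmax(self.kappa, dim=0).view(-1, 1, 1, 1)
        alpha = torch.softmax(self.alpha, dim=0).view(-1, 1, 1, 1)
        return (alpha * self.ffn(kappa * branches)).sum(dim=0)
```

Both weight vectors are trained jointly with the rest of the network; the softmax keeps each normalized to sum to one, as in the paper.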

Code
