Linear Attention Recurrent Neural Network (LARNN): A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell can be used inside a loop on the cell state, just like any other RNN cell (a minimal sketch of this windowed-attention idea appears after the list).
Sockeye: A sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet.
Witwicky: An implementation of the Transformer in PyTorch.
Speech Transformer: A PyTorch implementation of Speech Transformer, an end-to-end ASR model with a Transformer network, for Mandarin Chinese.
NMT-Keras: Neural Machine Translation with Keras.
PyTorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise hard-to-grasp concepts. IWSLT pretrained models are currently included.
Transformer: A TensorFlow implementation of the Transformer ("Attention Is All You Need").
Dab: Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Transformer: A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation".
BangalASR: Transformer-based Bangla speech recognition.
attention-is-all-you-need-paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems, 2017.
transformer: Neutron, a PyTorch-based implementation of the Transformer and its variants.
speech-transformer: A Transformer implementation specialized for speech recognition tasks, using PyTorch.
transformer: A PyTorch implementation of "Attention Is All You Need".
kospeech: Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition, leveraging PyTorch and Hydra.
transformer: A simple TensorFlow implementation of the Transformer.
pytorch-transformer: A PyTorch implementation of the Transformer model from "Attention Is All You Need".
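
As promised under the LARNN entry above, here is a minimal PyTorch sketch of the core idea: an LSTM cell whose fresh cell state queries a window of its own past cell states via multi-head attention. This is an illustration under stated assumptions, not the repository's actual code; the class name LARNNCellSketch, the window_size parameter, and the residual refinement step are all choices made for this example.

```python
# Minimal sketch of a LARNN-style cell: an LSTMCell whose new cell state is
# refined by multi-head attention over a window of its own past cell states.
# Names (LARNNCellSketch, window_size) are illustrative, not the repo's API.
import torch
import torch.nn as nn


class LARNNCellSketch(nn.Module):
    def __init__(self, input_size: int, hidden_size: int,
                 num_heads: int = 4, window_size: int = 16):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        # Multi-head attention: the fresh cell state is the query, the
        # window of past cell states is the key/value memory.
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.window_size = window_size

    def forward(self, x, state, past_cells):
        """x: (batch, input_size); state: (h, c); past_cells: list of (batch, hidden)."""
        h, c = self.cell(x, state)
        # Keep only the most recent `window_size` cell states (the "window").
        window = (past_cells + [c])[-self.window_size:]
        memory = torch.stack(window, dim=1)    # (batch, window, hidden)
        query = c.unsqueeze(1)                 # (batch, 1, hidden)
        attended, _ = self.attn(query, memory, memory)
        c = c + attended.squeeze(1)            # residual refinement of the cell state
        return h, c, window


# Usage: unroll over a sequence just like a plain LSTMCell.
if __name__ == "__main__":
    batch, steps, d_in, d_hid = 2, 10, 8, 32
    cell = LARNNCellSketch(d_in, d_hid)
    h = torch.zeros(batch, d_hid)
    c = torch.zeros(batch, d_hid)
    past = []
    for t in range(steps):
        x_t = torch.randn(batch, d_in)
        h, c, past = cell(x_t, (h, c), past)
    print(h.shape, c.shape)  # torch.Size([2, 32]) torch.Size([2, 32])
```

The loop mirrors how any RNN cell is driven step by step, which is the point the LARNN entry makes: the attention over past cell states lives entirely inside the cell, so the outer training code does not change.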