Sockeye: Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet
Stars: ✭ 990 (+5110.53%)
Mutual labels: machine-translation, attention-mechanism
Transformer: A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (+1326.32%)
Mutual labels: machine-translation, attention-mechanism
Attention Mechanisms: Implementations of a family of attention mechanisms, suitable for all kinds of natural language processing tasks and compatible with TensorFlow 2.0 and Keras.
Stars: ✭ 203 (+968.42%)
Mutual labels: machine-translation, attention-mechanism
Nmt Keras: Neural Machine Translation with Keras
Stars: ✭ 501 (+2536.84%)
Mutual labels: machine-translation, attention-mechanism
SequenceToSequence: A seq2seq dialogue/MT model with attention, implemented in TensorFlow.
Stars: ✭ 11 (-42.11%)
Mutual labels: machine-translation, attention-mechanism
SiGAT: Source code for Signed Graph Attention Networks (ICANN 2019) & SDGNN (AAAI 2021)
Stars: ✭ 37 (+94.74%)
Mutual labels: attention-mechanism
En-transformer: Implementation of the E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention
Stars: ✭ 131 (+589.47%)
Mutual labels: attention-mechanism
dgcnn: Clean & documented TF2 implementation of "An End-to-End Deep Learning Architecture for Graph Classification" (M. Zhang et al., 2018).
Stars: ✭ 21 (+10.53%)
Mutual labels: attention-mechanism
h-transformer-1d: Implementation of H-Transformer-1D, hierarchical attention for sequence learning
Stars: ✭ 121 (+536.84%)
Mutual labels: attention-mechanism
inmt: Interactive Neural Machine Translation tool
Stars: ✭ 44 (+131.58%)
Mutual labels: machine-translation
MetricMT: The official code repository for MetricMT, a reward optimization method for NMT with learned metrics
Stars: ✭ 23 (+21.05%)
Mutual labels: machine-translation
rtg: Reader Translator Generator, an NMT toolkit based on PyTorch
Stars: ✭ 26 (+36.84%)
Mutual labels: machine-translation
LanguageModel-using-Attention: PyTorch implementation of a basic language model using attention in an LSTM network
Stars: ✭ 27 (+42.11%)
Mutual labels: attention-mechanism
efficient-attention: An implementation of the efficient attention module.
Stars: ✭ 191 (+905.26%)
Mutual labels: attention-mechanism
CIAN: Implementation of the Character-level Intra Attention Network (CIAN) for Natural Language Inference (NLI) on the SNLI and MultiNLI corpora
Stars: ✭ 17 (-10.53%)
Mutual labels: attention-mechanism
axial-attention: Implementation of axial attention, attending to multi-dimensional data efficiently
Stars: ✭ 245 (+1189.47%)
Mutual labels: attention-mechanism
DCAN: [AAAI 2020] Code release for "Domain Conditioned Adaptation Network" (https://arxiv.org/abs/2005.06717)
Stars: ✭ 27 (+42.11%)
Mutual labels: attention-mechanism
skt: Sanskrit compound segmentation using a seq2seq model
Stars: ✭ 21 (+10.53%)
Mutual labels: machine-translation