Sockeye: Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet.
Stars: ✭ 990
Joeynmt: Minimalist NMT for educational purposes.
Stars: ✭ 420
Nematus: Open-source Neural Machine Translation in TensorFlow.
Stars: ✭ 730
Nmt List: A list of Neural MT implementations.
Stars: ✭ 359
Neuralmonkey: An open-source tool for sequence learning in NLP, built on TensorFlow.
Stars: ✭ 400
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418
parallel-corpora-tools: Tools for filtering and cleaning parallel and monolingual corpora for machine translation and other natural language processing tasks.
Stars: ✭ 35
Tf Seq2seq: Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387
NiuTrans.NMT: A fast Neural Machine Translation system, developed in C++ and built on NiuTensor for fast tensor APIs.
Stars: ✭ 112
transformer: Neutron, a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60
Pytorch Original Transformer: An implementation of the original Transformer model (Vaswani et al.), with an additional playground.py file for visualizing otherwise hard-to-grasp concepts. Currently includes IWSLT pretrained models.
Stars: ✭ 411
pynmt: A simple and complete PyTorch implementation of a neural machine translation system.
Stars: ✭ 13
Transformer: A TensorFlow implementation of the Transformer from "Attention Is All You Need".
Stars: ✭ 3,646
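Many of the Transformer implementations in this list build on the same core operation from "Attention Is All You Need": scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. As a quick reference, here is a minimal PyTorch sketch of that operation (not taken from any of the listed repositories):

```python
# Minimal scaled dot-product attention, as in "Attention Is All You Need":
#   Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """q, k, v: (batch, heads, seq_len, d_k); mask broadcastable to the scores."""
    d_k = q.size(-1)
    # Similarity of every query with every key, scaled to keep the softmax stable.
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)  # attention distribution over the keys
    return weights @ v                   # weighted sum of the value vectors

q = k = v = torch.randn(2, 4, 5, 16)         # batch=2, heads=4, seq_len=5, d_k=16
out = scaled_dot_product_attention(q, k, v)  # shape: (2, 4, 5, 16)
```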
dynmt-py: Neural machine translation implementation using DyNet's Python bindings.
Stars: ✭ 17
Witwicky: An implementation of the Transformer in PyTorch.
Stars: ✭ 21
Transformer: A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation".
Stars: ✭ 271
RNNSearch: An implementation of attention-based neural machine translation using PyTorch.
Stars: ✭ 43
Xmunmt: An implementation of RNNsearch using TensorFlow.
Stars: ✭ 69
Neural sp: End-to-end ASR/LM implementation with PyTorch.
Stars: ✭ 408
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group at Nanjing University.
Stars: ✭ 97
Subword Nmt: Unsupervised word segmentation for Neural Machine Translation and text generation.
Stars: ✭ 1,819
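Subword Nmt popularized byte-pair-encoding (BPE) segmentation for NMT: starting from characters, repeatedly merge the most frequent adjacent symbol pair until a target number of merges is reached. The following is a simplified, self-contained sketch of that merge-learning loop (illustrative only; the real library handles vocabularies, tie-breaking, and edge cases far more carefully):

```python
# Simplified BPE merge learning: repeatedly merge the most frequent adjacent
# symbol pair. Illustrative only; not the subword-nmt library's actual code.
import re
from collections import Counter

def learn_bpe(words, num_merges):
    """words: dict mapping space-separated symbol sequences to corpus frequencies."""
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for word, freq in words.items():
            symbols = word.split()
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)  # most frequent adjacent pair
        merges.append(best)
        # Merge the pair wherever it occurs as two whole, adjacent symbols.
        pattern = r"(?<!\S)" + re.escape(" ".join(best)) + r"(?!\S)"
        words = {re.sub(pattern, "".join(best), w): f for w, f in words.items()}
    return merges

corpus = {"l o w </w>": 5, "l o w e r </w>": 2, "n e w e s t </w>": 6}
print(learn_bpe(corpus, 3))  # first merges, e.g. [('w', 'e'), ('l', 'o'), ('n', 'e')]
```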
Npmt: Towards Neural Phrase-based Machine Translation.
Stars: ✭ 175
Nmtpy: A Python framework based on dl4mt-tutorial for experimenting with Neural Machine Translation pipelines.
Stars: ✭ 127
Linear Attention Recurrent Neural Network: A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN.
Stars: ✭ 119
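To make the LARNN description above concrete, here is a rough single-head sketch of an LSTM cell attending over a sliding window of its own past cell states (invented names, illustrative only; not the repository's code):

```python
# A rough sketch of the idea behind LARNN: an LSTMCell whose cell state attends
# over a sliding window of its own past cell states. Single head, invented
# names, illustrative only.
import math
from collections import deque
import torch
import torch.nn as nn
import torch.nn.functional as F

class WindowedAttentionLSTM(nn.Module):
    def __init__(self, input_size, hidden_size, window=8):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.window = window

    def forward(self, x):                      # x: (seq_len, batch, input_size)
        batch, hid = x.size(1), self.cell.hidden_size
        h = x.new_zeros(batch, hid)
        c = x.new_zeros(batch, hid)
        past = deque(maxlen=self.window)       # sliding window of past cell states
        outputs = []
        for x_t in x:
            h, c = self.cell(x_t, (h, c))
            if past:
                mem = torch.stack(list(past), dim=1)              # (batch, w, hid)
                # Current cell state queries the window of past cell states.
                scores = (mem @ c.unsqueeze(-1)).squeeze(-1) / math.sqrt(hid)
                attn = F.softmax(scores, dim=-1)                  # over the window
                c = c + (attn.unsqueeze(1) @ mem).squeeze(1)      # mix past states in
            past.append(c)
            outputs.append(h)
        return torch.stack(outputs)            # (seq_len, batch, hidden_size)

out = WindowedAttentionLSTM(16, 32)(torch.randn(10, 4, 16))  # (10, 4, 32)
```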
Eeg Dl: A deep learning library for EEG task (signal) classification, based on TensorFlow.
Stars: ✭ 165
Neural-Machine-Translation: Several basic neural machine translation models implemented in PyTorch & TensorFlow.
Stars: ✭ 29
TS3000 TheChatBOT: A social networking chatbot trained on a Reddit dataset. It supports open-ended queries and is built on the concept of Neural Machine Translation. Beware: it can be sarcastic, just like its creator 😝. It uses the PyTorch framework and Python 3.
Stars: ✭ 20
SequenceToSequence: A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11
Nmtpytorch: Sequence-to-Sequence framework in PyTorch.
Stars: ✭ 392
kospeech: Open-source toolkit for end-to-end Korean automatic speech recognition, leveraging PyTorch and Hydra.
Stars: ✭ 456
Word-Level-Eng-Mar-NMT: Translating English sentences to Marathi using Neural Machine Translation.
Stars: ✭ 37
h-transformer-1d: Implementation of H-Transformer-1D, hierarchical attention for sequence learning.
Stars: ✭ 121
En-transformer: Implementation of the E(n)-Transformer, which extends the ideas of Welling's E(n)-Equivariant Graph Neural Network to attention.
Stars: ✭ 131
Machine-Translation-Hindi-to-english-: Machine translation is the task of converting text in one language into another. Unlike traditional phrase-based translation systems, which consist of many small sub-components tuned separately, neural machine translation attempts to build and train a single large neural network that reads a sentence and outputs a correct translation.
Stars: ✭ 19
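The entry above summarizes the encoder-decoder idea shared by most NMT systems in this list: one network encodes the source sentence, and a decoder conditioned on that encoding generates the translation. A minimal PyTorch sketch of that architecture (invented names, GRU-based, teacher forcing, no attention; illustrative only):

```python
# A minimal sketch of the single-network encoder-decoder idea described above
# (invented names, GRU-based, no attention; illustrative only).
import torch
import torch.nn as nn

class TinySeq2Seq(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb=64, hid=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, tgt_vocab)

    def forward(self, src, tgt):
        # Read the whole source sentence into a single hidden state...
        _, state = self.encoder(self.src_emb(src))
        # ...then condition the decoder on it to produce the target sequence.
        dec, _ = self.decoder(self.tgt_emb(tgt), state)
        return self.out(dec)                 # (batch, tgt_len, tgt_vocab)

model = TinySeq2Seq(src_vocab=1000, tgt_vocab=1200)
logits = model(torch.randint(0, 1000, (2, 7)),   # source token ids
               torch.randint(0, 1200, (2, 9)))   # target token ids (teacher forcing)
print(logits.shape)                              # torch.Size([2, 9, 1200])
```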
Compact-Global-Descriptor: PyTorch implementation of "Compact Global Descriptor for Neural Networks" (CGD).
Stars: ✭ 22
Transformer-Transducer: PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020).
Stars: ✭ 61
dodrio: Exploring attention weights in transformer-based models with linguistic knowledge.
Stars: ✭ 233
NLP-paper: 🎨 A natural language processing (NLP) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23
zero: A neural machine translation system.
Stars: ✭ 121
transformer: A PyTorch implementation of "Attention Is All You Need".
Stars: ✭ 28
Transformer Tts: A PyTorch implementation of "Neural Speech Synthesis with Transformer Network".
Stars: ✭ 418
transformer: A simple TensorFlow implementation of the Transformer.
Stars: ✭ 25
CrabNet: Predict materials properties using only composition information!
Stars: ✭ 57
minimal-nmt: A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36
OverlapPredator: [CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 293
transformer-slt: Sign Language Translation with Transformers (COLING 2020, ECCV'20 SLRTP Workshop).
Stars: ✭ 92
FragmentVC: Any-to-any voice conversion by end-to-end extracting and fusing fine-grained voice fragments with attention.
Stars: ✭ 134
visualization: A collection of visualization functions.
Stars: ✭ 189
enformer-pytorch: Implementation of Enformer, DeepMind's attention network for predicting gene expression, in PyTorch.
Stars: ✭ 146
Transformer-in-Transformer: An implementation of Transformer in Transformer in TensorFlow for image classification, with attention inside local patches.
Stars: ✭ 40
theano-recurrence: Recurrent Neural Networks (RNN, GRU, LSTM) and their bidirectional versions (BiRNN, BiGRU, BiLSTM) for word- and character-level language modelling in Theano.
Stars: ✭ 40