Sockeye - Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet
Stars: ✭ 990 (+1841.18%)
Nmt Keras - Neural Machine Translation with Keras
Stars: ✭ 501 (+882.35%)
dynmt-py - Neural machine translation implementation using DyNet's Python bindings
Stars: ✭ 17 (-66.67%)
transformer - Neutron: a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (+17.65%)
Neural sp - End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+700%)
transformer - A PyTorch Implementation of "Attention Is All You Need"
Stars: ✭ 28 (-45.1%)
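Many of the projects in this list implement "Attention Is All You Need", so a minimal, framework-free sketch of the scaled dot-product attention at the paper's core may help orient readers. This is illustrative pure Python (vectors as lists of floats), not code from any listed project; real implementations use batched tensors.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
    Q, K, V are lists of vectors; returns one output vector per query."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted average of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# The query [1, 0] matches the first key more strongly, so the output
# leans toward the first value vector.
result = attention([[1.0, 0.0]], [[1.0, 0.0], [0.0, 1.0]],
                   [[1.0, 0.0], [0.0, 1.0]])
```

Because the softmax weights sum to 1, each output row here is a convex combination of the value vectors; the Transformer repeats this operation across multiple heads and layers.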
Pytorch Seq2seq - Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+6601.96%)
Joeynmt - Minimalist NMT for educational purposes
Stars: ✭ 420 (+723.53%)
kospeech - Open-source toolkit for end-to-end Korean automatic speech recognition, leveraging PyTorch and Hydra.
Stars: ✭ 456 (+794.12%)
Witwicky - An implementation of the Transformer in PyTorch.
Stars: ✭ 21 (-58.82%)
Text summarization with tensorflow - Implementation of a seq2seq model for summarization of textual data, demonstrated on Amazon reviews, GitHub issues, and news articles.
Stars: ✭ 226 (+343.14%)
Speech recognition with tensorflow - Implementation of a seq2seq model for speech recognition using the latest version of TensorFlow, with an architecture similar to Listen, Attend and Spell.
Stars: ✭ 253 (+396.08%)
transformer - Build English-Vietnamese machine translation with the ProtonX Transformer. :D
Stars: ✭ 41 (-19.61%)
transformer - A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-50.98%)
seq2seq-pytorch - Sequence-to-sequence models in PyTorch
Stars: ✭ 41 (-19.61%)
parse seq2seq - A TensorFlow implementation of a neural sequence-to-sequence parser for converting natural language queries to logical form.
Stars: ✭ 26 (-49.02%)
skt - Sanskrit compound segmentation using a seq2seq model
Stars: ✭ 21 (-58.82%)
Transformer-Transducer - PyTorch implementation of "Transformer Transducer: A Streamable Speech Recognition Model with Transformer Encoders and RNN-T Loss" (ICASSP 2020)
Stars: ✭ 61 (+19.61%)
CVAE Dial - CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-68.63%)
speech-transformer - Transformer implementation specialized in speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-21.57%)
SequenceToSequence - A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-78.43%)
NiuTrans.NMT - A fast neural machine translation system, developed in C++ and built on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+119.61%)
RNNSearch - An implementation of attention-based neural machine translation using PyTorch
Stars: ✭ 43 (-15.69%)
Embedding - Code for embedding models and a summary of study notes
Stars: ✭ 25 (-50.98%)
Paddlenlp - NLP Core Library and Model Zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (+315.69%)
Cluener2020 - CLUENER2020: fine-grained named entity recognition for Chinese
Stars: ✭ 689 (+1250.98%)
Lingvo - A TensorFlow framework for building sequence models
Stars: ✭ 2,361 (+4529.41%)
Transformer Temporal Tagger - Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (+7.84%)
sb-nmt - Code for Synchronous Bidirectional Neural Machine Translation (SB-NMT)
Stars: ✭ 66 (+29.41%)
Speech Transformer - A PyTorch implementation of the Speech Transformer, an end-to-end ASR system with a Transformer network, for Mandarin Chinese.
Stars: ✭ 565 (+1007.84%)
Turbotransformers - A fast and user-friendly runtime for Transformer inference (BERT, ALBERT, GPT-2, decoders, etc.) on CPU and GPU.
Stars: ✭ 826 (+1519.61%)
NLP-paper - Natural language processing tutorials: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-54.9%)
Word-Level-Eng-Mar-NMT - Translating English sentences to Marathi using Neural Machine Translation
Stars: ✭ 37 (-27.45%)
Seq2seq - Minimal Seq2Seq model with Attention for Neural Machine Translation in PyTorch
Stars: ✭ 552 (+982.35%)
Tensorflow Ml Nlp - Natural language processing starting with TensorFlow and machine learning (from logistic regression to Transformer chatbots)
Stars: ✭ 176 (+245.1%)
Dab - Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (+476.47%)
Nmt List - A list of Neural MT implementations
Stars: ✭ 359 (+603.92%)
Tf Seq2seq - Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (+658.82%)
attention-is-all-you-need-paper - Implementation of Vaswani, Ashish, et al. "Attention Is All You Need." Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (+90.2%)
classy - A simple-to-use library for building high-performance machine learning models in NLP.
Stars: ✭ 61 (+19.61%)
Transformer - A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (+431.37%)
Natural-Language-Processing - Various architectures and novel paper implementations for natural language processing tasks such as sequence modelling and neural machine translation.
Stars: ✭ 48 (-5.88%)
Transformer - A TensorFlow implementation of the Transformer: "Attention Is All You Need"
Stars: ✭ 3,646 (+7049.02%)
deep-molecular-optimization - Molecular optimization that captures chemists' intuition using seq2seq with attention and the Transformer
Stars: ✭ 60 (+17.65%)
Neuralmonkey - An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (+684.31%)
Pytorch Original Transformer - My implementation of the original Transformer model (Vaswani et al.). A playground.py file is included for visualizing otherwise hard-to-grasp concepts, along with IWSLT pretrained models.
Stars: ✭ 411 (+705.88%)
Athena - An open-source implementation of a sequence-to-sequence based speech processing engine
Stars: ✭ 542 (+962.75%)
Nspm - 🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+205.88%)
Spark Nlp - State-of-the-art natural language processing
Stars: ✭ 2,518 (+4837.25%)
Nlp Tutorials - Simple implementations of NLP models, with tutorials written in Chinese at https://mofanpy.com
Stars: ✭ 394 (+672.55%)
Seq2seqchatbots - A wrapper around tensor2tensor to flexibly train, interact with, and generate data for neural chatbots.
Stars: ✭ 466 (+813.73%)