Sockeye: Sequence-to-sequence framework with a focus on Neural Machine Translation, based on Apache MXNet
Stars: ✭ 990 (+1550%)
Tf Seq2seq: Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (+545%)
kospeech: Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition, leveraging PyTorch and Hydra.
Stars: ✭ 456 (+660%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+5596.67%)
minimal-nmt: A minimal NMT example to serve as a seq2seq + attention reference.
Stars: ✭ 36 (-40%)
Kospeech: Open-Source Toolkit for End-to-End Korean Automatic Speech Recognition.
Stars: ✭ 190 (+216.67%)
Nmt Keras: Neural Machine Translation with Keras
Stars: ✭ 501 (+735%)
Joeynmt: Minimalist NMT for educational purposes
Stars: ✭ 420 (+600%)
zero: Zero, a neural machine translation system
Stars: ✭ 121 (+101.67%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-53.33%)
Seq2seq: A Chinese chatbot based on PyTorch, with an integrated beam-search algorithm.
Stars: ✭ 200 (+233.33%)
transformer-slt: Sign Language Translation with Transformers (COLING 2020; ECCV'20 SLRTP Workshop)
Stars: ✭ 92 (+53.33%)
attention-is-all-you-need-paper: Implementation of Vaswani, Ashish, et al. "Attention is all you need." Advances in Neural Information Processing Systems, 2017.
Stars: ✭ 97 (+61.67%)
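The Transformer implementations listed here all center on scaled dot-product attention. As a rough orientation, here is a minimal pure-Python sketch of that operation (toy dimensions, no batching or masking; the function names are illustrative, not from any repo above):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d), K: (n_k, d), V: (n_k, d_v), as nested lists.
    Returns (n_q, d_v): each query attends over all keys."""
    d = len(K[0])
    out = []
    for q in Q:
        # Score each key: dot(q, k) / sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        # Output is the attention-weighted sum of value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

This mirrors the formula softmax(QKᵀ/√d_k)·V from the paper; real implementations batch these operations into matrix multiplications and add multi-head projections.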
Tensorflow Shakespeare: Neural machine translation between the writings of Shakespeare and modern English, using TensorFlow
Stars: ✭ 244 (+306.67%)
Nn: 🧑‍🏫 50+ implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, and more 🧠
Stars: ✭ 5,720 (+9433.33%)
Transformer: A TensorFlow Implementation of the Transformer: Attention Is All You Need
Stars: ✭ 3,646 (+5976.67%)
Pytorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing otherwise seemingly hard concepts. Currently includes IWSLT-pretrained models.
Stars: ✭ 411 (+585%)
Seq2seqchatbots: A wrapper around tensor2tensor to flexibly train, interact with, and generate data for neural chatbots.
Stars: ✭ 466 (+676.67%)
Embedding: Code and study notes for embedding models.
Stars: ✭ 25 (-58.33%)
Witwicky: An implementation of the Transformer in PyTorch.
Stars: ✭ 21 (-65%)
Seq2seq.pytorch: Sequence-to-Sequence learning using PyTorch
Stars: ✭ 514 (+756.67%)
LightSeq: A High-Performance Inference Library for Sequence Processing and Generation
Stars: ✭ 501 (+735%)
Image-Caption: Using an LSTM or Transformer to solve image captioning in PyTorch
Stars: ✭ 36 (-40%)
Nspm: 🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+160%)
Seq2seq chatbot new: A TensorFlow implementation of a simple seq2seq dialogue system, with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (+140%)
Nlp made easy: Explains NLP building blocks in a simple manner.
Stars: ✭ 232 (+286.67%)
Openseq2seq: Toolkit for efficient experimentation with speech recognition, text-to-speech, and NLP
Stars: ✭ 1,378 (+2196.67%)
deep-molecular-optimization: Molecular optimization by capturing chemist's intuition, using seq2seq with attention and the Transformer
Stars: ✭ 60 (+0%)
Dab: Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (+390%)
Xmunmt: An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (+15%)
Neural sp: End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+580%)
Transformers: 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Stars: ✭ 55,742 (+92803.33%)
Tensorflow Ml Nlp: Natural language processing with TensorFlow and machine learning (from logistic regression to Transformer chatbots)
Stars: ✭ 176 (+193.33%)
Nlp Tutorials: Simple implementations of NLP models. Tutorials are written in Chinese on my website, https://mofanpy.com
Stars: ✭ 394 (+556.67%)
Speech Transformer: A PyTorch implementation of Speech Transformer, an end-to-end ASR model with a Transformer network, on Mandarin Chinese.
Stars: ✭ 565 (+841.67%)
Transformer Temporal Tagger: Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (-8.33%)
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (+61.67%)
Paddlenlp: NLP Core Library and Model Zoo based on PaddlePaddle 2.0
Stars: ✭ 212 (+253.33%)
transformer: A simple TensorFlow implementation of the Transformer
Stars: ✭ 25 (-58.33%)
speech-transformer: A Transformer implementation specialized for speech recognition tasks, using PyTorch.
Stars: ✭ 40 (-33.33%)
beam search: Beam search for neural-network sequence-to-sequence (encoder-decoder) models.
Stars: ✭ 31 (-48.33%)
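Several of these projects (the beam-search repo above, the chatbot entries, most NMT toolkits) decode with beam search rather than greedy decoding. A minimal, framework-free sketch of the idea follows; `step_fn` is a hypothetical interface standing in for whatever decoder the real libraries use:

```python
import math

def beam_search(step_fn, start_token, eos_token, beam_size=3, max_len=10):
    """Generic beam search over an autoregressive decoder.

    step_fn(prefix) must return a dict {token: probability} for the next
    token given the decoded prefix so far (an assumed interface).
    Returns the finished sequence with the best cumulative log-probability.
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, p in step_fn(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        # Keep only the top `beam_size` partial hypotheses.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_size]:
            (finished if seq[-1] == eos_token else beams).append((seq, score))
        if not beams:
            break
    return max(finished or beams, key=lambda c: c[1])[0]
```

Production implementations add length normalization and batched tensor scoring, but the prune-to-k-hypotheses-per-step structure is the same.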
Transformer Dynet: An implementation of the Transformer (Attention Is All You Need) in DyNet
Stars: ✭ 57 (-5%)
Word-Level-Eng-Mar-NMT: Translating English sentences to Marathi using Neural Machine Translation
Stars: ✭ 37 (-38.33%)
RNNSearch: An implementation of attention-based neural machine translation using PyTorch
Stars: ✭ 43 (-28.33%)
Neural-Machine-Translation: Several basic neural machine translation models implemented in PyTorch and TensorFlow
Stars: ✭ 29 (-51.67%)
NLP-paper: 🎨 NLP (natural language processing) tutorials, in Chinese: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-61.67%)
Nmtpytorch: Sequence-to-Sequence Framework in PyTorch
Stars: ✭ 392 (+553.33%)
TS3000 TheChatBOT: A social-networking chatbot trained on a Reddit dataset. It supports open-ended queries, built on the concept of Neural Machine Translation. Beware of it being sarcastic, just like its creator 😝. It uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-66.67%)
dynmt-py: Neural machine translation implementation using DyNet's Python bindings
Stars: ✭ 17 (-71.67%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ on top of NiuTensor for fast tensor APIs.
Stars: ✭ 112 (+86.67%)