Nmt Keras: Neural Machine Translation with Keras
Stars: ✭ 501 (-49.39%)
Pytorch Seq2seq: Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+245.25%)
Transformer: A TensorFlow implementation of the Transformer from "Attention Is All You Need"
Stars: ✭ 3,646 (+268.28%)
Joeynmt: Minimalist NMT for educational purposes
Stars: ✭ 420 (-57.58%)
Neural sp: End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (-58.79%)
transformer: Neutron, a PyTorch-based implementation of the Transformer and its variants.
Stars: ✭ 60 (-93.94%)
dynmt-py: Neural machine translation implementation using DyNet's Python bindings
Stars: ✭ 17 (-98.28%)
Tf Seq2seq: Sequence-to-sequence learning using TensorFlow.
Stars: ✭ 387 (-60.91%)
Neuralmonkey: An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (-59.6%)
Gluon2pytorch: Gluon-to-PyTorch deep neural network model converter
Stars: ✭ 70 (-92.93%)
Transformer: A PyTorch implementation of "Attention Is All You Need" and "Weighted Transformer Network for Machine Translation"
Stars: ✭ 271 (-72.63%)
SequenceToSequence: A seq2seq dialogue/MT model with attention, implemented in TensorFlow.
Stars: ✭ 11 (-98.89%)
RNNSearch: An implementation of attention-based neural machine translation using PyTorch
Stars: ✭ 43 (-95.66%)
transformer: A PyTorch implementation of "Attention Is All You Need"
Stars: ✭ 28 (-97.17%)
Dab: Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (-70.3%)
Nspm: 🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (-84.24%)
Kospeech: Open-source toolkit for end-to-end Korean automatic speech recognition.
Stars: ✭ 190 (-80.81%)
Nmt List: A list of Neural MT implementations
Stars: ✭ 359 (-63.74%)
Lingvo: A TensorFlow framework for building sequence models (e.g. machine translation, speech recognition), developed by Google.
Stars: ✭ 2,361 (+138.48%)
Speech recognition with tensorflow: Implementation of a seq2seq model for speech recognition using the latest version of TensorFlow. Architecture similar to Listen, Attend and Spell.
Stars: ✭ 253 (-74.44%)
Witwicky: An implementation of Transformer in PyTorch.
Stars: ✭ 21 (-97.88%)
Linear Attention Recurrent Neural Network: A recurrent attention module consisting of an LSTM cell that can query its own past cell states by means of windowed multi-head attention. The formulas are derived from the BN-LSTM and the Transformer network. The LARNN cell with attention can easily be used inside a loop on the cell state, just like any other RNN. (LARNN)
Stars: ✭ 119 (-87.98%)
Modernmt: Neural adaptive machine translation that adapts to context and learns from corrections.
Stars: ✭ 231 (-76.67%)
Transformer Temporal Tagger: Code and data from the paper "BERT Got a Date: Introducing Transformers to Temporal Tagging"
Stars: ✭ 55 (-94.44%)
Word-Level-Eng-Mar-NMT: Translating English sentences to Marathi using neural machine translation
Stars: ✭ 37 (-96.26%)
Njunmt Tf: An open-source neural machine translation system developed by the Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (-90.2%)
NLP-paper: 🎨 Natural language processing (NLP) tutorial 🎨 https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-97.68%)
kospeech: Open-source toolkit for end-to-end Korean automatic speech recognition, leveraging PyTorch and Hydra.
Stars: ✭ 456 (-53.94%)
NiuTrans.NMT: A fast neural machine translation system, developed in C++ and relying on NiuTensor for fast tensor APIs.
Stars: ✭ 112 (-88.69%)
Pytorch Original Transformer: My implementation of the original Transformer model (Vaswani et al.). Additionally includes a playground.py file for visualizing otherwise hard-to-grasp concepts, and currently ships IWSLT pretrained models.
Stars: ✭ 411 (-58.48%)
Npmt: Towards Neural Phrase-based Machine Translation
Stars: ✭ 175 (-82.32%)
Openseq2seq: Toolkit for efficient experimentation with speech recognition, text-to-speech, and NLP
Stars: ✭ 1,378 (+39.19%)
Text summarization with tensorflow: Implementation of a seq2seq model for summarization of textual data. Demonstrated on Amazon reviews, GitHub issues, and news articles.
Stars: ✭ 226 (-77.17%)
Xmunmt: An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (-93.03%)
Embedding: Code and study notes for embedding models
Stars: ✭ 25 (-97.47%)
Image-Caption: Using an LSTM or a Transformer for image captioning in PyTorch
Stars: ✭ 36 (-96.36%)
Keras Attention: Visualizing RNNs using the attention mechanism
Stars: ✭ 697 (-29.6%)
minimal-nmt: A minimal NMT example to serve as a seq2seq+attention reference.
Stars: ✭ 36 (-96.36%)
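Most of the seq2seq+attention references in this list share the same core computation at decoding time: score each encoder state against the current decoder state, normalize the scores, and form a context vector. As a rough, library-agnostic sketch (NumPy, with hypothetical shapes and a made-up `attention` helper — not the API of any project above), Luong-style dot-product attention looks like:

```python
import numpy as np

def attention(decoder_state, encoder_states):
    """Dot-product attention: score each encoder state against the
    current decoder state, softmax the scores, and return the
    context vector (weighted sum) together with the weights."""
    scores = encoder_states @ decoder_state          # (src_len,) one score per source position
    scores = scores - scores.max()                   # shift for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax over source positions
    context = weights @ encoder_states               # (hidden,) weighted sum of encoder states
    return context, weights

# Toy usage: 5 source positions, hidden size 8
rng = np.random.default_rng(0)
enc = rng.normal(size=(5, 8))
dec = rng.normal(size=(8,))
ctx, w = attention(dec, enc)
print(w.sum())  # the attention weights form a distribution over source positions
```

The concatenation of `ctx` with the decoder state (followed by a learned projection) is what these implementations typically feed into the next prediction step; bilinear (general) and additive (Bahdanau/RNNsearch) scoring differ only in how `scores` is computed.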
pynmt: A simple and complete PyTorch implementation of a neural machine translation system
Stars: ✭ 13 (-98.69%)
Nematus: Open-source neural machine translation in TensorFlow
Stars: ✭ 730 (-26.26%)
Encoder decoder: Four styles of encoder-decoder models in Python, Theano, Keras, and Seq2Seq
Stars: ✭ 269 (-72.83%)
Mxnet Centernet: Gluon implementation of "Objects as Points", aka "CenterNet"
Stars: ✭ 29 (-97.07%)
Gluon Face: An unofficial Gluon FR toolkit for face recognition. https://gluon-face.readthedocs.io
Stars: ✭ 264 (-73.33%)
Multi Scale Attention: Code for our paper "Multi-scale Guided Attention for Medical Image Segmentation"
Stars: ✭ 281 (-71.62%)
Autogluon: AutoML for text, image, and tabular data
Stars: ✭ 3,920 (+295.96%)
Seq2seq Summarizer: Pointer-generator reinforced seq2seq summarization in PyTorch
Stars: ✭ 306 (-69.09%)
Seq2seq chatbot: A TensorFlow implementation of a simple dialogue system based on a seq2seq model, with embedding, attention, and beam_search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (-68.89%)
Deep Learning In Production: In this repository, I will share some useful notes and references about deploying deep-learning-based models in production.
Stars: ✭ 3,104 (+213.54%)
Rezero: Official PyTorch repo for "ReZero is All You Need: Fast Convergence at Large Depth"
Stars: ✭ 317 (-67.98%)
Deepchatmodels: Conversation models in TensorFlow. (website removed)
Stars: ✭ 312 (-68.48%)