
A-Jacobson / minimal-nmt

Licence: MIT license
A minimal nmt example to serve as an seq2seq+attention reference.

Programming Languages

Jupyter Notebook
Python

Projects that are alternatives of or similar to minimal-nmt

TS3000 TheChatBOT
It's a social-networking chatbot trained on a Reddit dataset. It supports open-ended queries, built on the concept of Neural Machine Translation. Beware: it can be sarcastic, just like its creator 😝. BTW, it uses the PyTorch framework and Python 3.
Stars: ✭ 20 (-44.44%)
Mutual labels:  beam-search, neural-machine-translation, attention-mechanism
Seq2seq chatbot new
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 144 (+300%)
Mutual labels:  seq2seq, beam-search, attention-mechanism
Seq2seq chatbot
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam search; the dataset is Cornell Movie Dialogs.
Stars: ✭ 308 (+755.56%)
Mutual labels:  seq2seq, beam-search, attention-mechanism
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+2650%)
Mutual labels:  seq2seq, neural-machine-translation, attention-mechanism
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (+341.67%)
Mutual labels:  seq2seq, beam-search, attention-mechanism
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (+66.67%)
Mutual labels:  seq2seq, beam-search, neural-machine-translation
Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (+975%)
Mutual labels:  seq2seq, beam-search, neural-machine-translation
Openseq2seq
Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
Stars: ✭ 1,378 (+3727.78%)
Mutual labels:  seq2seq, neural-machine-translation
Awesome Speech Recognition Speech Synthesis Papers
Automatic Speech Recognition (ASR), Speaker Verification, Speech Synthesis, Text-to-Speech (TTS), Language Modelling, Singing Voice Synthesis (SVS), Voice Conversion (VC)
Stars: ✭ 2,085 (+5691.67%)
Mutual labels:  seq2seq, attention-mechanism
Nspm
🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+333.33%)
Mutual labels:  seq2seq, neural-machine-translation
Tensorflow Shakespeare
Neural machine translation between the writings of Shakespeare and modern English using TensorFlow
Stars: ✭ 244 (+577.78%)
Mutual labels:  seq2seq, neural-machine-translation
Xmunmt
An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (+91.67%)
Mutual labels:  seq2seq, neural-machine-translation
Seq2seq.pytorch
Sequence-to-Sequence learning using PyTorch
Stars: ✭ 514 (+1327.78%)
Mutual labels:  seq2seq, neural-machine-translation
Seq2seq
A PyTorch-based Chinese chatbot with integrated beam search.
Stars: ✭ 200 (+455.56%)
Mutual labels:  seq2seq, beam-search
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+9394.44%)
Mutual labels:  seq2seq, neural-machine-translation
Nlp made easy
Explains nlp building blocks in a simple manner.
Stars: ✭ 232 (+544.44%)
Mutual labels:  seq2seq, beam-search
S2VT-seq2seq-video-captioning-attention
S2VT (seq2seq) video captioning with bahdanau & luong attention implementation in Tensorflow
Stars: ✭ 18 (-50%)
Mutual labels:  seq2seq, attention-mechanism
SequenceToSequence
A seq2seq with attention dialogue/MT model implemented by TensorFlow.
Stars: ✭ 11 (-69.44%)
Mutual labels:  seq2seq, attention-mechanism
dynmt-py
Neural machine translation implementation using dynet's python bindings
Stars: ✭ 17 (-52.78%)
Mutual labels:  seq2seq, neural-machine-translation
Neural sp
End-to-end ASR/LM implementation with PyTorch
Stars: ✭ 408 (+1033.33%)
Mutual labels:  seq2seq, attention-mechanism

Minimal Neural Machine Translation


Resources

Neural Machine Translation by Jointly Learning to Align and Translate https://arxiv.org/pdf/1409.0473.pdf

Effective Approaches to Attention-based Neural Machine Translation https://arxiv.org/pdf/1508.04025.pdf

Massive Exploration of Neural Machine Translation Architectures https://arxiv.org/pdf/1703.03906.pdf

Contents

  • Encoder --> Attention --> Decoder architecture
  • Luong attention
  • Training on the Multi30k German-to-English translation task
  • Attention visualization
  • Teacher forcing
  • Greedy decoding
  • NMT tutorial notebook
  • Minimal beam search decoding
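The decoding strategies listed above can be sketched in a few lines of plain Python. The snippet below is an illustrative toy, not this repo's implementation: `step_fn`, `toy_step`, and the token names are hypothetical stand-ins for a real decoder that returns next-token probabilities.

```python
import math

def beam_search(step_fn, start_token, end_token, beam_width=3, max_len=10):
    """Minimal beam search over a token-level step function.

    step_fn(prefix) returns a list of (token, prob) candidates for the
    next token. Returns the highest-scoring (sequence, log-prob) pair.
    """
    beams = [([start_token], 0.0)]  # (sequence, cumulative log-prob)
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            if seq[-1] == end_token:         # finished hypotheses are kept as-is
                candidates.append((seq, score))
                continue
            for token, prob in step_fn(seq):
                candidates.append((seq + [token], score + math.log(prob)))
        # keep only the beam_width best partial hypotheses
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
        if all(seq[-1] == end_token for seq, _ in beams):
            break
    return beams[0]

# toy "model": prefers token "b" for the first two steps, then ends
def toy_step(prefix):
    if len(prefix) < 3:
        return [("a", 0.3), ("b", 0.6), ("<eos>", 0.1)]
    return [("<eos>", 1.0)]

best_seq, best_score = beam_search(toy_step, "<sos>", "<eos>")
```

With `beam_width=1` this degenerates to greedy decoding, which is the connection between the last two bullets above.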

Setup

  1. Install PyTorch 0.4.1:
conda install pytorch=0.4.1 -c pytorch
  2. Install the other requirements:
pip install -r requirements.txt

System Requirements

Training with a batch size of 32 takes ~3 GB of GPU RAM. If that is too much, lower the batch size or reduce the network dimensionality in hyperparams.py.
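For example, an edit along these lines would reduce memory use (the variable names here are illustrative — check hyperparams.py for the actual ones):

```python
# hyperparams.py -- illustrative values; the repo's variable names may differ
batch_size = 16    # halved from 32 to roughly halve activation memory
hidden_dim = 256   # a smaller encoder/decoder hidden size also reduces memory
```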

Usage

python train.py

View logs in TensorBoard; decent alignments should appear after 2-3 epochs:

tensorboard --logdir runs

(partially trained attention heatmap)

