
roeeaharoni / dynmt-py

Licence: other
Neural machine translation implementation using DyNet's Python bindings

Programming Languages

python
shell
perl

Projects that are alternatives of or similar to dynmt-py

Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+5723.53%)
Mutual labels:  machine-translation, seq2seq, neural-machine-translation, sequence-to-sequence
Xmunmt
An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (+305.88%)
Mutual labels:  seq2seq, neural-machine-translation, sequence-to-sequence
Machine Translation
Stars: ✭ 51 (+200%)
Mutual labels:  machine-translation, seq2seq, sequence-to-sequence
Npmt
Towards Neural Phrase-based Machine Translation
Stars: ✭ 175 (+929.41%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence
Word-Level-Eng-Mar-NMT
Translating English sentences to Marathi using Neural Machine Translation
Stars: ✭ 37 (+117.65%)
Mutual labels:  seq2seq, neural-machine-translation, sequence-to-sequence
Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (+2176.47%)
Mutual labels:  seq2seq, neural-machine-translation, sequence-to-sequence
Openseq2seq
Toolkit for efficient experimentation with Speech Recognition, Text2Speech and NLP
Stars: ✭ 1,378 (+8005.88%)
Mutual labels:  seq2seq, neural-machine-translation, sequence-to-sequence
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+2370.59%)
Mutual labels:  machine-translation, seq2seq, neural-machine-translation
Neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (+2252.94%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence
Nmt List
A list of Neural MT implementations
Stars: ✭ 359 (+2011.76%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (+152.94%)
Mutual labels:  seq2seq, neural-machine-translation, sequence-to-sequence
Nspm
🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+817.65%)
Mutual labels:  machine-translation, seq2seq, neural-machine-translation
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+20005.88%)
Mutual labels:  seq2seq, neural-machine-translation, sequence-to-sequence
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+2847.06%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence
Nematus
Open-Source Neural Machine Translation in TensorFlow
Stars: ✭ 730 (+4194.12%)
Mutual labels:  machine-translation, neural-machine-translation, sequence-to-sequence
Opennmt Tf
Neural machine translation and sequence learning using TensorFlow
Stars: ✭ 1,223 (+7094.12%)
Mutual labels:  machine-translation, neural-machine-translation
Mt Paper Lists
MT paper lists (by conference)
Stars: ✭ 105 (+517.65%)
Mutual labels:  machine-translation, neural-machine-translation
SequenceToSequence
A seq2seq-with-attention dialogue/MT model implemented in TensorFlow.
Stars: ✭ 11 (-35.29%)
Mutual labels:  machine-translation, seq2seq
Opus Mt
Open neural machine translation models and web services
Stars: ✭ 111 (+552.94%)
Mutual labels:  machine-translation, neural-machine-translation
Subword Nmt
Unsupervised Word Segmentation for Neural Machine Translation and Text Generation
Stars: ✭ 1,819 (+10600%)
Mutual labels:  machine-translation, neural-machine-translation

dynmt-py

Neural machine translation implementation using DyNet's Python bindings.

Example Usage:

python dynmt.py --dynet-autobatch 0 --dynet-devices GPU:1 --dynet-mem 12000 \
--input-dim=500 --hidden-dim=1024 --epochs=100 --layers=1 --optimization=ADADELTA \
--batch-size=60 --beam-size=5 --vocab-size=30000 --plot --eval-after=10000 \
train_source.txt train_target.txt dev_source.txt dev_target.txt test_source.txt test_target.txt path/to/model/dir

Options:

Name Description
-h --help show a help message and exit
--dynet-mem MEM allocate MEM megabytes of memory for dynet (see dynet's documentation for more details)
--dynet-gpus GPUS how many GPUs to use (see dynet's documentation for more details)
--dynet-devices DEV CPU/GPU ids to use (see dynet's documentation for more details)
--dynet-autobatch AUTO turn auto-batching on (1) or off (0) (see dynet's documentation for more details)
--input-dim=INPUT input embeddings dimension [default: 300]
--hidden-dim=HIDDEN LSTM hidden layer dimension [default: 100]
--epochs=EPOCHS number of training epochs [default: 1]
--layers=LAYERS number of LSTM layers [default: 1]
--optimization=OPTIMIZATION chosen optimization method (ADAM/SGD/ADAGRAD/MOMENTUM/ADADELTA) [default: ADADELTA]
--reg=REGULARIZATION regularization parameter for optimization [default: 0]
--learning=LEARNING learning rate parameter for optimization [default: 0.0001]
--batch-size=BATCH batch size [default: 1]
--beam-size=BEAM beam size in beam search [default: 5]
--vocab-size=VOCAB max vocabulary size [default: 99999]
--eval-after=EVALAFTER number of training batches to wait before evaluation [default: 1000]
--max-len=MAXLEN max train sequence length [default: 50]
--max-pred=MAXPRED max predicted sequence length [default: 50]
--grad-clip=GRADCLIP gradient clipping threshold [default: 5.0]
--max-patience=MAXPATIENCE number of checkpoints without improvement on the dev set before early stopping [default: 100]
--plot plot a learning curve while training each model
--override override an existing model with the same name, if one exists
--ensemble=ENSEMBLE ensemble model paths separated by a comma
--last-state only use last encoder state
--eval skip training and run evaluation only
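For illustration, the evaluation-related options can be combined into a decode-only run over trained models. This is a sketch: the model file paths passed to --ensemble are hypothetical placeholders, and the data files are the ones from the training example above:

```shell
# Decode-only sketch: skip training (--eval) and combine several trained
# models (--ensemble). Model paths below are hypothetical placeholders.
python dynmt.py --dynet-devices GPU:1 --dynet-mem 12000 \
  --eval --beam-size=5 \
  --ensemble=path/to/model/dir/model1,path/to/model/dir/model2 \
  train_source.txt train_target.txt dev_source.txt dev_target.txt \
  test_source.txt test_target.txt path/to/model/dir
```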

Arguments (must be given in this order):

Name Description
TRAIN_INPUTS_PATH train inputs path
TRAIN_OUTPUTS_PATH train outputs path
DEV_INPUTS_PATH development inputs path
DEV_OUTPUTS_PATH development outputs path
TEST_INPUTS_PATH test inputs path
TEST_OUTPUTS_PATH test outputs path
RESULTS_PATH results file path
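The positional path pairs above suggest the usual plain-text parallel-corpus format: one sentence per line, with the source and target files aligned by line number (an assumption; the README does not spell the format out). A small sanity check along these lines, using the file names from the example invocation:

```python
# Sketch: verify that a source/target file pair is line-aligned before
# training. One-sentence-per-line alignment is an assumed convention here.

def check_parallel(src_path, trg_path):
    """Return the shared line count, or raise if the files are misaligned."""
    with open(src_path, encoding="utf-8") as src, \
         open(trg_path, encoding="utf-8") as trg:
        src_lines = src.read().splitlines()
        trg_lines = trg.read().splitlines()
    if len(src_lines) != len(trg_lines):
        raise ValueError(
            f"line count mismatch: {len(src_lines)} vs {len(trg_lines)}")
    return len(src_lines)

# Tiny demo pair (hypothetical content).
with open("train_source.txt", "w", encoding="utf-8") as f:
    f.write("hello world\ngood morning\n")
with open("train_target.txt", "w", encoding="utf-8") as f:
    f.write("hallo welt\nguten morgen\n")

print(check_parallel("train_source.txt", "train_target.txt"))  # → 2
```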