pfnet-research / vat_nmt

License: MIT
Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019

Projects that are alternatives to or similar to vat_nmt

Tf Seq2seq
Sequence to sequence learning using TensorFlow.
Stars: ✭ 387 (+1659.09%)
Mutual labels:  neural-machine-translation, nmt
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+2177.27%)
Mutual labels:  neural-machine-translation, nmt
Nmtpytorch
Sequence-to-Sequence Framework in PyTorch
Stars: ✭ 392 (+1681.82%)
Mutual labels:  neural-machine-translation, nmt
parallel-corpora-tools
Tools for filtering and cleaning parallel and monolingual corpora for machine translation and other natural language processing tasks.
Stars: ✭ 35 (+59.09%)
Mutual labels:  neural-machine-translation, nmt
Njunmt Pytorch
Stars: ✭ 79 (+259.09%)
Mutual labels:  neural-machine-translation, nmt
Nmt List
A list of Neural MT implementations
Stars: ✭ 359 (+1531.82%)
Mutual labels:  neural-machine-translation, nmt
Joeynmt
Minimalist NMT for educational purposes
Stars: ✭ 420 (+1809.09%)
Mutual labels:  neural-machine-translation, nmt
RNNSearch
An implementation of attention-based neural machine translation using Pytorch
Stars: ✭ 43 (+95.45%)
Mutual labels:  neural-machine-translation, nmt
Xmunmt
An implementation of RNNsearch using TensorFlow
Stars: ✭ 69 (+213.64%)
Mutual labels:  neural-machine-translation, nmt
Rnn Nmt
An encoder-decoder neural machine translation model based on a bidirectional RNN with an attention mechanism
Stars: ✭ 46 (+109.09%)
Mutual labels:  neural-machine-translation, nmt
Neuralmonkey
An open-source tool for sequence learning in NLP built on TensorFlow.
Stars: ✭ 400 (+1718.18%)
Mutual labels:  neural-machine-translation, nmt
Nmtpy
nmtpy is a Python framework based on dl4mt-tutorial to experiment with Neural Machine Translation pipelines.
Stars: ✭ 127 (+477.27%)
Mutual labels:  neural-machine-translation, nmt
Nematus
Open-Source Neural Machine Translation in Tensorflow
Stars: ✭ 730 (+3218.18%)
Mutual labels:  neural-machine-translation, nmt
Njunmt Tf
An open-source neural machine translation system developed by Natural Language Processing Group, Nanjing University.
Stars: ✭ 97 (+340.91%)
Mutual labels:  neural-machine-translation, nmt
Subword Nmt
Unsupervised Word Segmentation for Neural Machine Translation and Text Generation
Stars: ✭ 1,819 (+8168.18%)
Mutual labels:  neural-machine-translation, nmt
Document Transformer
Improving the Transformer translation model with document-level context
Stars: ✭ 160 (+627.27%)
Mutual labels:  neural-machine-translation
semi-supervised-paper-implementation
Reproduce some methods in semi-supervised papers.
Stars: ✭ 35 (+59.09%)
Mutual labels:  vat
Mtbook
Machine Translation: Foundations and Models (《机器翻译:基础与模型》) by Tong Xiao and Jingbo Zhu
Stars: ✭ 2,307 (+10386.36%)
Mutual labels:  neural-machine-translation
Nspm
🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
Stars: ✭ 156 (+609.09%)
Mutual labels:  neural-machine-translation
vat-rates
💸 {Digital,Cloud,Electronic,Online} Services VAT Rate Database
Stars: ✭ 81 (+268.18%)
Mutual labels:  vat

Virtual Adversarial Training for NMT (Transformer model)

Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019
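The core idea of adversarial regularization is to perturb the word embeddings in the direction that most increases the loss, with the perturbation rescaled to a fixed L2 norm (the role of the --eps flag below). A minimal NumPy sketch of that rescaling step, not the repository's actual code:

```python
import numpy as np

def adversarial_perturbation(grad, eps=1.0):
    """Rescale a loss gradient w.r.t. an embedding to L2 norm eps.

    In AdvT/VAT this direction is added to the embedding before a second
    forward pass; eps plays the role of the --eps flag in training.
    """
    norm = np.sqrt(np.sum(grad ** 2)) + 1e-12  # avoid division by zero
    return eps * grad / norm

g = np.array([3.0, 4.0])                # toy gradient for one embedding
r = adversarial_perturbation(g, eps=1.0)
print(np.linalg.norm(r))                # 1.0: direction kept, norm fixed to eps
```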

References

Motoki Sato, Jun Suzuki, Shun Kiyono. "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019.

How to use

Requirements

  • Python 3.6+
  • Chainer 6.x+
  • CuPy 6.x+
# install cupy, chainer, and logzero
$ pip install cupy
$ pip install chainer
$ pip install logzero

For Chainer installation details, see https://docs.chainer.org/en/stable/install.html

Train (iwslt2016-de-en)

$ python3 -u chainer_transformer.py --mode train --gpus 0 --dataset iwslt2016-de-en --seed 1212 --epoch 40 --out model_transformer_de-en

Train with VAT (iwslt2016-de-en)

$ python3 -u chainer_transformer.py --mode train --gpus 0 --dataset iwslt2016-de-en --seed 1212 --epoch 40 --out model_transformer_de-en_vat_enc --use-vat 1 --eps 1.0 --perturbation-target 0

Perturbation targets

The --perturbation-target flag selects where the adversarial perturbation is applied (enc, dec, or enc-dec):

0    enc (encoder)
1    dec (decoder)
0 1  enc-dec (both)
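The two-value form (`0 1`) suggests the flag takes a list of integers. With argparse, that could be declared as in the following hypothetical sketch; `nargs="+"` is an assumption, and the actual declaration in chainer_transformer.py may differ:

```python
import argparse

# Hypothetical parser mirroring the --perturbation-target codes above.
# nargs="+" lets "--perturbation-target 0 1" parse as the list [0, 1].
parser = argparse.ArgumentParser()
parser.add_argument("--perturbation-target", type=int, nargs="+", default=[0])

args = parser.parse_args(["--perturbation-target", "0", "1"])
print(args.perturbation_target)  # [0, 1] -> perturb both encoder and decoder
```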

VAT, Adv, VAT-Adv

The --use-vat flag selects the regularization method:

0  none (baseline)
1  vat (virtual adversarial training)
2  adv (adversarial training)
3  vat-adv (both)
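The practical difference between the options: adversarial training (2) perturbs embeddings against the supervised loss using the gold labels, while VAT (1) perturbs so as to maximize the KL divergence between the model's output distributions before and after perturbation, which needs no labels. A toy NumPy illustration of the VAT-style KL penalty, not the repository's implementation:

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a logit vector."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def kl_divergence(p, q):
    """KL(p || q) between dense distributions: the VAT smoothness penalty."""
    return float(np.sum(p * (np.log(p) - np.log(q))))

clean = softmax(np.array([2.0, 0.5, -1.0]))      # p(y|x), toy logits
perturbed = softmax(np.array([1.8, 0.7, -0.9]))  # p(y|x + r), after perturbation
penalty = kl_divergence(clean, perturbed)
print(penalty >= 0.0)  # True: KL divergence is always non-negative
```

Minimizing this penalty pushes the model to be smooth around each input, which is the regularization effect the paper exploits.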

Eval

$ python3 -u chainer_transformer.py --mode test --gpus 0 --dataset iwslt2016-de-en --batchsize 600 --model model_transformer_de-en/model_epoch_40.npz --beam 20 --max-length 60 --datatype eval1
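Decoding with `--beam 20` keeps the 20 highest-scoring partial translations at each step, up to `--max-length` tokens. A toy beam search over a fixed per-step score table, a stand-in for the real Transformer decoder rather than the repository's code:

```python
import numpy as np

def beam_search(step_logprobs, beam=2, max_length=3, eos=0):
    """Toy beam search: step_logprobs[t][v] is the log-prob of token v at step t."""
    hyps = [((), 0.0)]  # (token sequence, cumulative log-prob)
    for t in range(max_length):
        candidates = []
        for seq, score in hyps:
            if seq and seq[-1] == eos:      # finished hypotheses carry over
                candidates.append((seq, score))
                continue
            for v, lp in enumerate(step_logprobs[t]):
                candidates.append((seq + (v,), score + lp))
        hyps = sorted(candidates, key=lambda x: -x[1])[:beam]  # prune to beam
    return hyps[0]

table = np.log(np.array([[0.1, 0.6, 0.3],
                         [0.2, 0.2, 0.6],
                         [0.9, 0.05, 0.05]]))
best, score = beam_search(table, beam=2, max_length=3)
print(best)  # (1, 2, 0)
```

With beam=2 the search keeps only the two best prefixes per step, trading exactness for speed, which is the same trade-off `--beam 20` makes at a larger scale.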

License

MIT License. Please see the LICENSE file for details.

Authors

We thank Takeru Miyato (@takerum), who gave us valuable comments about AdvT/VAT.

The Transformer codebase was developed by Shun Kiyono (@butsugiri).

Contact

Please send comments or questions to @aonotas.
