
DeepLearnXMU / VNMT

License: other
Code for "Variational Neural Machine Translation" (EMNLP 2016)

Programming Languages

python
prolog
shell

Projects that are alternatives of or similar to VNMT

Nmtpy
nmtpy is a Python framework based on dl4mt-tutorial to experiment with Neural Machine Translation pipelines.
Stars: ✭ 127 (+135.19%)
Mutual labels:  theano, nmt
Nmt Keras
Neural Machine Translation with Keras
Stars: ✭ 501 (+827.78%)
Mutual labels:  theano, nmt
Deep Svdd
Repository for the Deep One-Class Classification ICML 2018 paper
Stars: ✭ 159 (+194.44%)
Mutual labels:  theano
Dandelion
A light weight deep learning framework, on top of Theano, offering better balance between flexibility and abstraction
Stars: ✭ 15 (-72.22%)
Mutual labels:  theano
Rnn ctc
Recurrent Neural Network and Long Short Term Memory (LSTM) with Connectionist Temporal Classification implemented in Theano. Includes a Toy training example.
Stars: ✭ 220 (+307.41%)
Mutual labels:  theano
Pixelcnn
Theano reimplementation of pixelCNN architecture
Stars: ✭ 170 (+214.81%)
Mutual labels:  theano
SemiDenseNet
Repository containing the code of one of the networks that we employed in the iSEG Grand MICCAI Challenge 2017, infant brain segmentation.
Stars: ✭ 55 (+1.85%)
Mutual labels:  theano
Aesara
Aesara is a fork of the Theano library that is maintained by the PyMC developers. It was previously named Theano-PyMC.
Stars: ✭ 145 (+168.52%)
Mutual labels:  theano
rnn benchmarks
RNN benchmarks of pytorch, tensorflow and theano
Stars: ✭ 85 (+57.41%)
Mutual labels:  theano
Keras Gp
Keras + Gaussian Processes: Learning scalable deep and recurrent kernels.
Stars: ✭ 218 (+303.7%)
Mutual labels:  theano
email-summarization
A module for E-mail Summarization which uses clustering of skip-thought sentence embeddings.
Stars: ✭ 81 (+50%)
Mutual labels:  theano
Cnn Text Classification Keras
Text Classification by Convolutional Neural Network in Keras
Stars: ✭ 213 (+294.44%)
Mutual labels:  theano
Opt Mmd
Learning kernels to maximize the power of MMD tests
Stars: ✭ 181 (+235.19%)
Mutual labels:  theano
vat nmt
Implementation of "Effective Adversarial Regularization for Neural Machine Translation", ACL 2019
Stars: ✭ 22 (-59.26%)
Mutual labels:  nmt
Neuralnets
Deep Learning libraries tested on images and time series
Stars: ✭ 163 (+201.85%)
Mutual labels:  theano
seq2seq-autoencoder
Theano implementation of Sequence-to-Sequence Autoencoder
Stars: ✭ 12 (-77.78%)
Mutual labels:  theano
Mariana
The Cutest Deep Learning Framework which is also a wonderful Declarative Language
Stars: ✭ 151 (+179.63%)
Mutual labels:  theano
Alphazero gomoku
An implementation of the AlphaZero algorithm for Gomoku (also called Gobang or Five in a Row)
Stars: ✭ 2,570 (+4659.26%)
Mutual labels:  theano
Deepjazz
Deep learning driven jazz generation using Keras & Theano!
Stars: ✭ 2,766 (+5022.22%)
Mutual labels:  theano
symbolic-pymc
Tools for the symbolic manipulation of PyMC models, Theano, and TensorFlow graphs.
Stars: ✭ 58 (+7.41%)
Mutual labels:  theano

VNMT

The current implementation of VNMT supports only a 1-layer NMT model; deeper layers are not supported and would be meaningless here.

Source code for Variational Neural Machine Translation.

If you use this code, please cite our paper:

@InProceedings{zhang-EtAl:2016:EMNLP20162,
  author    = {Zhang, Biao  and  Xiong, Deyi  and  Su, Jinsong  and  Duan, Hong  and  Zhang, Min},
  title     = {Variational Neural Machine Translation},
  booktitle = {Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing},
  month     = {November},
  year      = {2016},
  address   = {Austin, Texas},
  publisher = {Association for Computational Linguistics},
  pages     = {521--530},
  url       = {https://aclweb.org/anthology/D16-1050}
}

Basic Requirement

Our source code is based on GroundHog. Please install it before using our code.

How to Run?

To train a good VNMT model, you need to follow two steps.

Step 1. Pretraining

Pretrain a base NMT model with GroundHog.

Step 2. Retraining

Go to the work directory and copy the pretrained model into it; it will be used to initialize the parameters of VNMT.

Then simply run the script below (before that, you need to re-configure the chinese.py file for your own dataset :)):

run.sh

That's it!
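For orientation, GroundHog-style setups typically express the dataset configuration as a dictionary of file paths. The sketch below is purely hypothetical — the function name, keys, and paths are illustrative assumptions, not the actual contents of chinese.py, which you should adapt to your own data:

```python
# Hypothetical sketch of the kind of settings a GroundHog-style
# state file such as chinese.py holds; the real keys and file
# names depend on the repository's state format.
def dataset_state():
    state = {}
    # Binarized parallel training data (source and target sides).
    state['source'] = ['data/train.zh.shuf.h5']
    state['target'] = ['data/train.en.shuf.h5']
    # Vocabulary files mapping words to indices.
    state['word_indx'] = 'data/vocab.zh.pkl'
    state['word_indx_trgt'] = 'data/vocab.en.pkl'
    # Reload the pretrained base model to initialize VNMT.
    state['reload'] = True
    state['prefix'] = 'models/vnmt_'
    return state

print(sorted(dataset_state().keys()))
```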

Notice that our test and development sets come from the NIST dataset, which follows the SGM format. Please see work/data/dev for an example.
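To give a feel for the SGM format, here is a minimal sketch that extracts plain-text segments from a NIST-style file. The snippet embedded below is a made-up toy example (real NIST files carry more attributes), and the regex approach is just an illustration, not part of this repository:

```python
import re

# Toy NIST-style SGM snippet; real dev files have more markup.
sgm = """<srcset setid="nist_example" srclang="zh">
<doc docid="doc1">
<seg id="1">this is the first segment</seg>
<seg id="2">this is the second segment</seg>
</doc>
</srcset>"""

# Pull the plain text out of each <seg> tag.
segments = re.findall(r"<seg[^>]*>(.*?)</seg>", sgm, re.DOTALL)
for s in segments:
    print(s)
```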

For any comments or questions, please email Biao Zhang.
