wang-h / Variational-NMT

Licence: other
Variational Neural Machine Translation System

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Variational-NMT

Quality-Estimation1
Machine translation subtask (translation quality estimation): reproducing the results of the Alibaba paper from WMT 2018
Stars: ✭ 19 (-48.65%)
Mutual labels:  nmt
RNNSearch
An implementation of attention-based neural machine translation using PyTorch
Stars: ✭ 43 (+16.22%)
Mutual labels:  nmt
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-51.35%)
Mutual labels:  variational-autoencoder
continuous Bernoulli
C programs for simulating, transforming, and computing the test statistic of the continuous Bernoulli distribution; also covers the continuous Binomial and continuous Trinomial distributions.
Stars: ✭ 22 (-40.54%)
Mutual labels:  variational-autoencoder
adVAE
Implementation of 'Self-Adversarial Variational Autoencoder with Gaussian Anomaly Prior Distribution for Anomaly Detection'
Stars: ✭ 17 (-54.05%)
Mutual labels:  variational-autoencoder
multimodal-vae-public
A PyTorch implementation of "Multimodal Generative Models for Scalable Weakly-Supervised Learning" (https://arxiv.org/abs/1802.05335)
Stars: ✭ 98 (+164.86%)
Mutual labels:  variational-autoencoder
parallel-corpora-tools
Tools for filtering and cleaning parallel and monolingual corpora for machine translation and other natural language processing tasks.
Stars: ✭ 35 (-5.41%)
Mutual labels:  nmt
Quality-Estimation2
Machine translation subtask (translation quality estimation): fine-tuning with a Bi-LSTM added on top of a BERT model
Stars: ✭ 31 (-16.22%)
Mutual labels:  nmt
shared-latent-space
Shared Latent Space VAEs
Stars: ✭ 15 (-59.46%)
Mutual labels:  variational-autoencoder
vae-pytorch
AE and VAE Playground in PyTorch
Stars: ✭ 53 (+43.24%)
Mutual labels:  variational-autoencoder
linguistic-style-transfer-pytorch
Implementation of "Disentangled Representation Learning for Non-Parallel Text Style Transfer" (ACL 2019) in PyTorch
Stars: ✭ 55 (+48.65%)
Mutual labels:  variational-autoencoder
STEP
Spatial Temporal Graph Convolutional Networks for Emotion Perception from Gaits
Stars: ✭ 39 (+5.41%)
Mutual labels:  variational-autoencoder
vaegan
An implementation of VAEGAN (variational autoencoder + generative adversarial network).
Stars: ✭ 88 (+137.84%)
Mutual labels:  variational-autoencoder
CVAE Dial
The CVAE_XGate model from the paper "Xu, Dusek, Konstas, Rieser. Better Conversations by Modeling, Filtering, and Optimizing for Coherence and Diversity"
Stars: ✭ 16 (-56.76%)
Mutual labels:  variational-autoencoder
gradient-boosted-normalizing-flows
We got a stew going!
Stars: ✭ 20 (-45.95%)
Mutual labels:  variational-autoencoder
playing with vae
Comparing FC VAE / FCN VAE / PCA / UMAP on MNIST / FMNIST
Stars: ✭ 53 (+43.24%)
Mutual labels:  variational-autoencoder
vae-torch
Variational autoencoder for anomaly detection (in PyTorch).
Stars: ✭ 38 (+2.7%)
Mutual labels:  variational-autoencoder
CIKM18-LCVA
Code for CIKM'18 paper, Linked Causal Variational Autoencoder for Inferring Paired Spillover Effects.
Stars: ✭ 13 (-64.86%)
Mutual labels:  variational-autoencoder
keras-adversarial-autoencoders
Experiments with Adversarial Autoencoders using Keras
Stars: ✭ 20 (-45.95%)
Mutual labels:  variational-autoencoder
VAE-Gumbel-Softmax
An implementation of a Variational Autoencoder using the Gumbel-Softmax reparametrization trick (ICLR 2017) in TensorFlow (tested on r1.5, CPU and GPU).
Stars: ✭ 66 (+78.38%)
Mutual labels:  variational-autoencoder

Variational Neural Machine Translation System

Implemented in PyTorch 0.4; some modules are adapted from OpenNMT-py.

References

  1. Su, Jinsong, et al. "Variational Recurrent Neural Machine Translation." arXiv preprint arXiv:1801.05119 (2018).

  2. Zhang, Biao, et al. "Variational Neural Machine Translation." arXiv preprint arXiv:1605.07869 (2016).

Differences

For Variational NMT, I did not use mean-pooling over both sides (source and target); in my tests, using only the last source hidden state was sufficient to achieve good performance.

For Variational Recurrent NMT, my tests likewise showed that using only the current RNN state is sufficient to achieve good performance.
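
As a minimal sketch of the first difference (not the repository's actual code; the class name is hypothetical), the following contrasts the two ways of summarizing the encoder states fed to the inference network: mean-pooling, as in the papers above, versus taking only the last source hidden state:

    import torch
    import torch.nn as nn

    class LatentEncoder(nn.Module):
        """Maps a summary of encoder states to a Gaussian latent variable z."""

        def __init__(self, hidden_size, latent_size, use_mean_pooling=False):
            super().__init__()
            self.use_mean_pooling = use_mean_pooling
            self.to_mu = nn.Linear(hidden_size, latent_size)
            self.to_logvar = nn.Linear(hidden_size, latent_size)

        def forward(self, enc_states):
            # enc_states: (seq_len, batch, hidden_size)
            if self.use_mean_pooling:
                summary = enc_states.mean(dim=0)  # mean-pooling, as in the papers
            else:
                summary = enc_states[-1]          # last source hidden state only
            mu, logvar = self.to_mu(summary), self.to_logvar(summary)
            # Reparameterization trick: z = mu + sigma * eps, with eps ~ N(0, I)
            z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
            return z, mu, logvar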

The paper

Yang, Zichao, et al. "Improved Variational Autoencoders for Text Modeling Using Dilated Convolutions." arXiv preprint arXiv:1702.08139 (2017).

explains why a GRU is used instead of an LSTM for the RNN cell: in general, a VAE-LSTM decoder performs worse than a vanilla LSTM decoder.
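
Following that observation, here is a minimal sketch of selecting the decoder cell (the `rnn_type` option name is hypothetical, not necessarily the repository's):

    import torch.nn as nn

    def build_decoder_cell(rnn_type, input_size, hidden_size):
        # Per Yang et al. (2017), GRU is preferred over LSTM for a VAE decoder.
        if rnn_type == "GRU":
            return nn.GRU(input_size, hidden_size)
        if rnn_type == "LSTM":
            return nn.LSTM(input_size, hidden_size)
        raise ValueError("unknown rnn_type: " + rnn_type)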

Usage

Training

    python train.py --config config/nmt.ini

Test

    python translate.py --config config/nmt.ini    
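
Both commands read the same .ini file. A minimal sketch of how such a `--config` flag is typically wired up with Python's standard `argparse` and `configparser` (the section and key names below are hypothetical, not taken from this repository):

    import argparse
    import configparser

    parser = argparse.ArgumentParser()
    parser.add_argument("--config", required=True, help="path to an .ini config file")
    args = parser.parse_args()

    config = configparser.ConfigParser()
    config.read(args.config)
    # Hypothetical keys, for illustration only:
    # latent_size = config.getint("model", "latent_size")
    # batch_size = config.getint("train", "batch_size")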