
jcyk / Gtos

License: MIT
Code for AAAI2020 paper "Graph Transformer for Graph-to-Sequence Learning"

Programming Languages

Python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to Gtos

Thot
Thot toolkit for statistical machine translation
Stars: ✭ 53 (-58.91%)
Mutual labels:  machine-translation
Transformers without tears
Transformers without Tears: Improving the Normalization of Self-Attention
Stars: ✭ 80 (-37.98%)
Mutual labels:  machine-translation
Mt Paper Lists
MT paper lists (by conference)
Stars: ✭ 105 (-18.6%)
Mutual labels:  machine-translation
Comet
A Neural Framework for MT Evaluation
Stars: ✭ 58 (-55.04%)
Mutual labels:  machine-translation
Udacity Natural Language Processing Nanodegree
Tutorials and my solutions to the Udacity NLP Nanodegree
Stars: ✭ 73 (-43.41%)
Mutual labels:  machine-translation
Niutrans.smt
NiuTrans.SMT is an open-source statistical machine translation system developed by a joint team from the NLP Lab at Northeastern University and the NiuTrans Team. The NiuTrans system is fully developed in C++, so it runs fast and uses less memory. It currently supports phrase-based, hierarchical phrase-based and syntax-based (string-to-tree, tree-to-string and tree-to-tree) models for research-oriented studies.
Stars: ✭ 90 (-30.23%)
Mutual labels:  machine-translation
Machine Translation
Stars: ✭ 51 (-60.47%)
Mutual labels:  machine-translation
Stog
AMR Parsing as Sequence-to-Graph Transduction
Stars: ✭ 123 (-4.65%)
Mutual labels:  amr
Chinesenlp
Datasets, SOTA results of every fields of Chinese NLP
Stars: ✭ 1,206 (+834.88%)
Mutual labels:  machine-translation
En Fr Mlt Tensorflow
English-French Machine Language Translation in Tensorflow
Stars: ✭ 99 (-23.26%)
Mutual labels:  machine-translation
Neuralamr
Sequence-to-sequence models for AMR parsing and generation
Stars: ✭ 60 (-53.49%)
Mutual labels:  amr
Trixi.jl
A tree-based numerical simulation framework for hyperbolic PDEs written in Julia
Stars: ✭ 72 (-44.19%)
Mutual labels:  amr
Deep Learning Drizzle
Drench yourself in Deep Learning, Reinforcement Learning, Machine Learning, Computer Vision, and NLP by learning from these exciting lectures!!
Stars: ✭ 9,717 (+7432.56%)
Mutual labels:  machine-translation
Rtpdump
Extract audio file from RTP streams in pcap format
Stars: ✭ 54 (-58.14%)
Mutual labels:  amr
Opus Mt
Open neural machine translation models and web services
Stars: ✭ 111 (-13.95%)
Mutual labels:  machine-translation
Fasttext multilingual
Multilingual word vectors in 78 languages
Stars: ✭ 1,067 (+727.13%)
Mutual labels:  machine-translation
Opennmt Tf
Neural machine translation and sequence learning using TensorFlow
Stars: ✭ 1,223 (+848.06%)
Mutual labels:  machine-translation
Cluedatasetsearch
Search across all Chinese NLP datasets, with commonly used English NLP datasets also included
Stars: ✭ 2,112 (+1537.21%)
Mutual labels:  machine-translation
Nonautoreggenprogress
Tracking the progress in non-autoregressive generation (translation, transcription, etc.)
Stars: ✭ 118 (-8.53%)
Mutual labels:  machine-translation
Rtlamr
An rtl-sdr receiver for Itron ERT compatible smart meters operating in the 900MHz ISM band.
Stars: ✭ 1,326 (+927.91%)
Mutual labels:  amr

Graph Transformer

Code for our AAAI2020 paper,

Graph Transformer for Graph-to-Sequence Learning. [preprint]

Deng Cai and Wai Lam.

1. Environment Setup

The code is tested with Python 3.6. All dependencies are listed in requirements.txt.
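
A typical setup might look as follows (a minimal sketch assuming a standard venv/pip workflow; the environment name gtos-env is illustrative, only Python 3.6 and requirements.txt are stated in the repo):

python3.6 -m venv gtos-env          # create a Python 3.6 virtual environment (name is illustrative)
source gtos-env/bin/activate
pip install -r requirements.txt     # install the dependencies listed in requirements.txt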

2. Data Preprocessing

The instructions for Syntax-based Machine Translation are given in the translator_data folder.

The instructions for AMR-to-Text Generation are given in the generator_data folder.


Steps 3-6 should be conducted in the generator folder for AMR-to-Text Generation and in the translator folder for Syntax-based Machine Translation, respectively. The default settings in this repo should reproduce the results in our paper.

3. Vocab & Data Preparation

cd generator  # or translator, for Syntax-based Machine Translation
sh prepare.sh # check it before use

4. Train

cd generator  # or translator, for Syntax-based Machine Translation
sh train.sh # check it before use

5. Test

cd generator  # or translator, for Syntax-based Machine Translation
sh work.sh # check it before use

# postprocess
sh test.sh  # check it before use (make sure --output is set)

6. Evaluation

For BLEU: perl multi-bleu.perl (use -lc for lowercased evaluation)
For chrF++: python chrF++.py (c6+w2-F2)
For METEOR: meteor-1.5, i.e. java -Xmx2G -jar meteor-1.5.jar test reference -l en
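
For example, assuming the postprocessed system output is saved as test.txt and the references as reference.txt (hypothetical file names; substitute whatever file --output produced and your own reference file):

perl multi-bleu.perl -lc reference.txt < test.txt              # lowercased BLEU
java -Xmx2G -jar meteor-1.5.jar test.txt reference.txt -l en   # METEOR 1.5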

Citation

If you find the code useful, please cite our paper.

@inproceedings{cai-lam-2020-graph,
    title = "Graph Transformer for Graph-to-Sequence Learning",
    author = "Cai, Deng  and Lam, Wai",
    booktitle = "Proceedings of The Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI)",
    year = "2020",
}

Contact

For any questions, please drop an email to Deng Cai.

(Pretrained models and our system output are available upon request.)
