
Cartus / DCGCN

License: MIT
Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning (authors' MXNet implementation for the TACL19 paper)

Programming Languages

python
shell

Projects that are alternatives of or similar to DCGCN

GNN-Recommender-Systems
An index of recommendation algorithms that are based on Graph Neural Networks.
Stars: ✭ 505 (+591.78%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
SelfGNN
A PyTorch implementation of "SelfGNN: Self-supervised Graph Neural Networks without explicit negative sampling" paper, which appeared in The International Workshop on Self-Supervised Learning for the Web (SSL'21) @ the Web Conference 2021 (WWW'21).
Stars: ✭ 24 (-67.12%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
graphml-tutorials
Tutorials for Machine Learning on Graphs
Stars: ✭ 125 (+71.23%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+582.19%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Graph Based Deep Learning Literature
links to conference publications in graph-based deep learning
Stars: ✭ 3,428 (+4595.89%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
SimP-GCN
Implementation of the WSDM 2021 paper "Node Similarity Preserving Graph Convolutional Networks"
Stars: ✭ 43 (-41.1%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Literatures-on-GNN-Acceleration
A reading list for deep graph learning acceleration.
Stars: ✭ 50 (-31.51%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Representation Learning on Graphs with Jumping Knowledge Networks
Stars: ✭ 31 (-57.53%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Traffic-Prediction-Open-Code-Summary
Summary of open source code for deep learning models in the field of traffic prediction
Stars: ✭ 58 (-20.55%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
ProteinGCN
ProteinGCN: Protein model quality assessment using Graph Convolutional Networks
Stars: ✭ 88 (+20.55%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Spectral-Designed-Graph-Convolutions
Codes for "Bridging the Gap Between Spectral and Spatial Domains in Graph Neural Networks" paper
Stars: ✭ 39 (-46.58%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Stellargraph
StellarGraph - Machine Learning on Graphs
Stars: ✭ 2,235 (+2961.64%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Pytorch geometric
Graph Neural Network Library for PyTorch
Stars: ✭ 13,359 (+18200%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
Euler
A distributed graph deep learning framework.
Stars: ✭ 2,701 (+3600%)
Mutual labels:  graph-convolutional-networks, graph-neural-networks
grb
Graph Robustness Benchmark: A scalable, unified, modular, and reproducible benchmark for evaluating the adversarial robustness of Graph Machine Learning.
Stars: ✭ 70 (-4.11%)
Mutual labels:  graph-neural-networks
demo-routenet
Demo of RouteNet in ACM SIGCOMM'19
Stars: ✭ 79 (+8.22%)
Mutual labels:  graph-neural-networks
how attentive are gats
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)
Stars: ✭ 200 (+173.97%)
Mutual labels:  graph-neural-networks
text gcn tutorial
A tutorial & minimal example (8min on CPU) for Graph Convolutional Networks for Text Classification. AAAI 2019
Stars: ✭ 23 (-68.49%)
Mutual labels:  graph-convolutional-networks
kaggle-champs
Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (-32.88%)
Mutual labels:  graph-neural-networks
RioGNN
Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural Networks
Stars: ✭ 46 (-36.99%)
Mutual labels:  graph-neural-networks

Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning

This paper/code introduces Densely Connected Graph Convolutional Networks (DCGCNs) for graph-to-sequence learning. We evaluate the model on two tasks: AMR-to-Text Generation (AMR2015 and AMR2017) and Syntax-Based Machine Translation (EN2DE and EN2CS).

You can find the latest version of the TACL paper here.

This paper was presented at ACL 2019; you can find the video here (Session 6F: Machine Learning 4). Slides are also available (under the images directory) as a brief introduction to this work.

See below for an overview of the encoder (DCGCN) architecture: each block has two sub-blocks, both densely connected graph convolutional layers with different numbers of layers (n and m). For these two tasks we use n=6 and m=3; these are hyper-parameters.

Densely Connected Graph Convolutional Layers
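
To make the dense connectivity concrete, here is a minimal, illustrative Python/NumPy sketch of one densely connected GCN sub-block. This is not the authors' MXNet implementation, and the names (gcn_layer, dense_gcn_block) are hypothetical:

import numpy as np

# One graph convolution: neighbourhood aggregation (A @ X) followed by a
# linear map and a ReLU. A is the self-loop-augmented adjacency matrix.
def gcn_layer(A, X, W):
    return np.maximum(A @ X @ W, 0.0)

# Dense connectivity: each layer consumes the concatenation of the block
# input and the outputs of all preceding layers, DenseNet-style.
def dense_gcn_block(A, X, weights):
    outputs = [X]
    for W in weights:
        H = gcn_layer(A, np.concatenate(outputs, axis=-1), W)
        outputs.append(H)
    return np.concatenate(outputs, axis=-1)

# Toy usage: 4 nodes, 8 input features, n=6 layers of 8 features each.
A = np.eye(4)
X = np.random.randn(4, 8)
weights = [np.random.randn(8 * (i + 1), 8) for i in range(6)]
H = dense_gcn_block(A, X, weights)   # shape (4, 56)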

Dependencies

The model requires Python 3 and Apache MXNet Incubating; the code is built on Sockeye (see the requirements files for exact versions).

Installation

GPU

If you want to run Sockeye on a GPU, make sure your version of Apache MXNet Incubating contains the GPU bindings. Depending on your CUDA version, you can do this by running the following:

> pip install -r requirements/requirements.gpu-cu${CUDA_VERSION}.txt
> pip install .

where ${CUDA_VERSION} can be 75 (7.5), 80 (8.0), 90 (9.0), 91 (9.1), or 92 (9.2).
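
For example, with CUDA 9.0:

> pip install -r requirements/requirements.gpu-cu90.txt
> pip install .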

Preprocessing

We need to convert the dataset into extended Levi graphs for training. For details please refer to the paper.

For AMR-to-text, first obtain the AMR Sembank (LDC2017T10) and put the folder called abstract_meaning_representation_amr_2.0 inside the data folder. Then run:

./gen_amr.sh
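
For intuition, the plain Levi transformation turns every labeled edge into a node of its own, so edge labels can be embedded like ordinary tokens; the extended variant used in the paper adds further structure (see the paper for details). A hypothetical Python sketch of the basic idea:

# Illustrative only: turn each labeled edge (u, label, v) into a new node
# carrying the label, connected to both endpoints.
def to_levi_graph(nodes, labeled_edges):
    levi_nodes = list(nodes)
    levi_edges = []
    for u, label, v in labeled_edges:
        e = f"{label}#{len(levi_nodes)}"   # fresh node for the edge label
        levi_nodes.append(e)
        levi_edges.append((u, e))
        levi_edges.append((e, v))
    return levi_nodes, levi_edges

# Toy AMR-like fragment: (want :arg0 boy), (want :arg1 go)
print(to_levi_graph(["want", "boy", "go"],
                    [("want", ":arg0", "boy"), ("want", ":arg1", "go")]))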

For NMT, you can download the raw dataset from here first and change the data folder inside nmt_preprocess.py. Then run:

python nmt_preprocess.py

Alternatively, you can download our preprocessed datasets (en2cs and en2de) for the DCGCN model from here. The AMR corpus is covered by an LDC license, so we cannot distribute its preprocessed data; if you have the license, feel free to drop us an email to get it.

Training

To train the DCGCN model, run (here we use AMR2015 as an example):

./train_amr15.sh

Model checkpoints and logs will be saved to ./sockeye/amr2015_model.
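
For reference, train_amr15.sh ultimately launches a Sockeye training run; a rough, hypothetical sketch of such an invocation (file names invented, flags from stock Sockeye) is:

> python -m sockeye.train --source train.amr \
>     --target train.snt \
>     --validation-source dev.amr \
>     --validation-target dev.snt \
>     --output sockeye/amr2015_model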

Decoding

Once training finishes, we can use the trained model to decode the test set; run:

./decode_amr15.sh

This will use the last checkpoint (84th for AMR2015) by default. Use --checkpoints to specify a model checkpoint file.

Postprocessing

For AMR-to-Text generation, we also use scope markers, as in Konstas et al. (2017) and Beck et al. (2018). Both conduct named entity anonymization and named entity clustering in the preprocessing stage, so in the postprocessing stage we need to substitute the anonymized entities back; run:

./postprocess_amr.sh
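
As an illustration of what this substitution step does (hypothetical names; the real mapping files come from the preprocessing scripts):

# Replace anonymized entity placeholders in the decoded text with their
# original surface forms, recorded during preprocessing.
def deanonymize(tokens, entity_map):
    return [entity_map.get(tok, tok) for tok in tokens]

decoded = "person_0 visited country_0 last year".split()
mapping = {"person_0": "Barack Obama", "country_0": "France"}
print(" ".join(deanonymize(decoded, mapping)))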

For Syntax-Based Machine Translation, we use BPE on the decoder side. In the postprocessing stage, we need to merge the subword units back into natural-language sequences for evaluation; run:

./merge.sh
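
Assuming the common subword-nmt convention, where a token ending in @@ continues into the next token, merging amounts to deleting the @@ markers (merge.sh may implement this differently):

import re

# Join BPE subword units back into words by removing the "@@ " continuation
# markers, e.g. "bo@@ y" -> "boy".
def merge_bpe(line):
    return re.sub(r"@@ ?", "", line)

print(merge_bpe("the bo@@ y wan@@ ts to go"))   # -> "the boy wants to go"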

Evaluation

For BLEU score evaluation, run:

./eval_bleu.sh

Citation

@article{guo-etal-2019-densely,
    title = "Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning",
    author = "Guo, Zhijiang and Zhang, Yan and Teng, Zhiyang and Lu, Wei",
    journal = "Transactions of the Association for Computational Linguistics",
    volume = "7",
    month = mar,
    year = "2019",
    url = "https://www.aclweb.org/anthology/Q19-1019",
    pages = "297--312"
}

Pretrained Models

Pretrained models can be downloaded here.

After the download, unzip the file and put it under the sockeye directory.

For the AMR2015 dataset, the pretrained model achieves a BLEU score of 25.9; for AMR2017, it achieves 27.9. You can also train the model yourself with the given hyper-parameters; the results should be the same.

Related Repo

This repo is built on Graph-to-Sequence Learning using Gated Graph Neural Networks. DCGCNs can also be applied to other NLP tasks; for example, relation extraction: Attention Guided Graph Convolutional Networks for Relation Extraction.

Results

We also release the output of our model on the AMR2015 and AMR2017 datasets (both dev and test sets). Please refer to the results directory.
