benedekrozemberczki / PDN

License: GPL-3.0
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)

Programming Languages

python

Projects that are alternatives to or similar to PDN

resolutions-2019
A list of data mining and machine learning papers that I implemented in 2019.
Stars: ✭ 19 (-56.82%)
Mutual labels:  network-science, deepwalk, gcn, graph-classification, node-classification
FEATHER
The reference implementation of FEATHER from the CIKM '20 paper "Characteristic Functions on Graphs: Birds of a Feather, from Statistical Descriptors to Parametric Models".
Stars: ✭ 34 (-22.73%)
Mutual labels:  deepwalk, graph-classification, node-classification, graph2vec
GraphDeeSmartContract
Smart contract vulnerability detection using graph neural network (DR-GCN).
Stars: ✭ 84 (+90.91%)
Mutual labels:  gcn, graph2vec, graph-neural-network
Awesome Graph Classification
A collection of important graph embedding, classification and representation learning papers with implementations.
Stars: ✭ 4,309 (+9693.18%)
Mutual labels:  deepwalk, graph-classification, graph2vec
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+275%)
Mutual labels:  molecules, message-passing, gnn
FasterTransformer
Transformer-related optimizations, including BERT and GPT
Stars: ✭ 1,571 (+3470.45%)
Mutual labels:  transformer, bert
tensorflow-ml-nlp-tf2
Hands-on materials for "Natural Language Processing with TensorFlow 2 and Machine Learning (from Logistic Regression to BERT and GPT-3)"
Stars: ✭ 245 (+456.82%)
Mutual labels:  transformer, bert
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+31.82%)
Mutual labels:  transformer, bert
TDRG
Transformer-based Dual Relation Graph for Multi-label Image Recognition. ICCV 2021
Stars: ✭ 32 (-27.27%)
Mutual labels:  transformer, gcn
Graph Neural Net
Graph Convolutional Networks, Graph Attention Networks, Gated Graph Neural Net, Mixhop
Stars: ✭ 27 (-38.64%)
Mutual labels:  gcn, graph-neural-network
GNN-Recommender-Systems
An index of recommendation algorithms that are based on Graph Neural Networks.
Stars: ✭ 505 (+1047.73%)
Mutual labels:  gcn, gnn
NLP-paper
NLP (natural language processing) tutorials: https://dataxujing.github.io/NLP-paper/
Stars: ✭ 23 (-47.73%)
Mutual labels:  transformer, bert
ASAP
AAAI 2020 - ASAP: Adaptive Structure Aware Pooling for Learning Hierarchical Graph Representations
Stars: ✭ 83 (+88.64%)
Mutual labels:  gcn, graph-classification
dgcnn
Clean & Documented TF2 implementation of "An end-to-end deep learning architecture for graph classification" (M. Zhang et al., 2018).
Stars: ✭ 21 (-52.27%)
Mutual labels:  graph-classification, gnn
BGCN
A Tensorflow implementation of "Bayesian Graph Convolutional Neural Networks" (AAAI 2019).
Stars: ✭ 129 (+193.18%)
Mutual labels:  gcn, gnn
TabFormer
Code & Data for "Tabular Transformers for Modeling Multivariate Time Series" (ICASSP, 2021)
Stars: ✭ 209 (+375%)
Mutual labels:  transformer, bert
sticker2
Further developed as SyntaxDot: https://github.com/tensordot/syntaxdot
Stars: ✭ 14 (-68.18%)
Mutual labels:  transformer, bert
QGNN
Quaternion Graph Neural Networks (ACML 2021) (PyTorch and TensorFlow)
Stars: ✭ 31 (-29.55%)
Mutual labels:  graph-classification, node-classification
stagin
STAGIN: Spatio-Temporal Attention Graph Isomorphism Network
Stars: ✭ 34 (-22.73%)
Mutual labels:  gnn, graph-neural-network
les-military-mrc-rank7
Les Cup: Rank 7 solution for the Second National "Military Intelligent Machine Reading" Challenge
Stars: ✭ 37 (-15.91%)
Mutual labels:  transformer, bert

PDN

A PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf 2021).

Abstract

In this work we propose Pathfinder Discovery Networks (PDNs), a method for jointly learning a message passing graph over a multiplex network with a downstream semi-supervised model. PDNs inductively learn an aggregated weight for each edge, optimized to produce the best outcome for the downstream learning task. PDNs are a generalization of attention mechanisms on graphs which allow flexible construction of similarity functions between nodes, edge convolutions, and cheap multiscale mixing layers. We show that PDNs overcome weaknesses of existing methods for graph attention (e.g. Graph Attention Networks), such as the diminishing weight problem. Our experimental results demonstrate competitive predictive performance on academic node classification tasks. Additional results from a challenging suite of node classification experiments show how PDNs can learn a wider class of functions than existing baselines. We analyze the relative computational complexity of PDNs, and show that PDN runtime is not considerably higher than that of static-graph models. Finally, we discuss how PDNs can be used to construct an easily interpretable attention mechanism that allows users to understand information propagation in the graph.

This repository provides a PyTorch implementation of PDN as described in the paper:

Pathfinder Discovery Networks for Neural Message Passing. Benedek Rozemberczki, Peter Englert, Amol Kapoor, Martin Blais, Bryan Perozzi. WebConf, 2021. [Paper]
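As a rough illustration of the idea (a minimal sketch, not the exact architecture in this repository), the PyTorch snippet below learns one positive weight per edge from the multiplex edge features with a small MLP and feeds those weights into an otherwise standard two-layer GCN. The class name, layer widths, and sigmoid activation are illustrative choices; the widths simply mirror the --edge-filters and --node-filters defaults.

import torch
from torch_geometric.nn import GCNConv

class PDNSketch(torch.nn.Module):
    """Illustrative PDN-style model (an assumption, not the repo's exact
    model): an edge MLP collapses multiplex edge features into a single
    positive weight per edge, and the learned weights steer a weighted
    GCN message passing step."""

    def __init__(self, node_features, edge_features, edge_filters=32,
                 node_filters=32, classes=2):
        super().__init__()
        # Edge MLP: multiplex edge features -> one positive weight per edge.
        self.edge_net = torch.nn.Sequential(
            torch.nn.Linear(edge_features, edge_filters),
            torch.nn.ReLU(),
            torch.nn.Linear(edge_filters, 1),
            torch.nn.Sigmoid(),  # keep the learned edge weights positive
        )
        self.conv_1 = GCNConv(node_features, node_filters)
        self.conv_2 = GCNConv(node_filters, classes)

    def forward(self, x, edge_index, edge_feats):
        edge_weight = self.edge_net(edge_feats).view(-1)
        x = torch.relu(self.conv_1(x, edge_index, edge_weight))
        return torch.log_softmax(self.conv_2(x, edge_index, edge_weight), dim=1)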

Citing

If you find PDN useful in your research, please consider citing the following paper:

@inproceedings{rozemberczki2021pdn,
    title={{Pathfinder Discovery Networks for Neural Message Passing}},
    author={Benedek Rozemberczki and Peter Englert and Amol Kapoor and Martin Blais and Bryan Perozzi},
    booktitle={Proceedings of The Web Conference 2021},
    year={2021},
    organization={ACM}
}

Requirements

The codebase is implemented in Python 3.8.5. The package versions used for development are listed below.

tqdm               >=4.50.2
numpy              >=1.19.2
texttable          >=1.6.3
argparse           >=1.1.0
torch              >=1.7.1
torch-geometric    >=1.6.3
torch_spline_conv  >=1.2.0
torch_sparse       >=0.6.8
torch_scatter      >=2.0.5
torch_cluster      >=1.5.8
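The torch-geometric companion packages (torch_scatter, torch_sparse, torch_cluster, torch_spline_conv) ship precompiled wheels tied to specific torch and CUDA versions, so install torch first and point pip at a matching wheel index. The commands below are a sketch: the ${TORCH} and ${CUDA} tags are placeholders, and the exact index URL should be taken from the torch-geometric installation docs for your setup.

$ pip install tqdm numpy texttable
$ pip install torch==1.7.1
$ pip install torch-scatter torch-sparse torch-cluster torch-spline-conv -f https://pytorch-geometric.com/whl/torch-${TORCH}+${CUDA}.html
$ pip install torch-geometric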

Options

The training of a PDN model is handled by the `src/main.py` script, which provides the following command-line arguments.

Input and output options

  --edge-path            STR    Edge list NumPy array.        Default is `input/edges.npy`.
  --node-features-path   STR    Node features NumPy array.    Default is `input/node_features.npy`.
  --edge-features-path   STR    Edge features NumPy array.    Default is `input/edge_features.npy`.
  --target-path          STR    Target classes NumPy array.   Default is `input/target.npy`.
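All four inputs are plain NumPy arrays saved with np.save. The snippet below writes a compatible toy dataset; the exact shapes (edge-list rows as source-target pairs, one feature row per node and per edge, one integer label per node) are an assumption for illustration, so check them against your own data.

import numpy as np

# Toy triangle graph on 3 nodes; all shapes are illustrative assumptions.
edges = np.array([[0, 1], [1, 2], [2, 0]])   # one (source, target) pair per row
node_features = np.random.rand(3, 8)         # one feature vector per node
edge_features = np.random.rand(3, 4)         # one feature vector per edge
target = np.array([0, 1, 0])                 # one class label per node

np.save("input/edges.npy", edges)
np.save("input/node_features.npy", node_features)
np.save("input/edge_features.npy", edge_features)
np.save("input/target.npy", target)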

Model options

  --seed                INT     Random seed.                   Default is 42.
  --epochs              INT     Number of training epochs.     Default is 200.
  --test-size           FLOAT   Test set ratio.                Default is 0.9.
  --learning-rate       FLOAT   Adam learning rate.            Default is 0.01.
  --edge-filters        INT     Number of PDN filters.         Default is 32.
  --node-filters        INT     Number of GCN filters.         Default is 32.

Examples

The following commands train a neural network and score on the test set. Training a model on the default dataset:

$ python src/main.py

Training a PDN model for 100 epochs:

$ python src/main.py --epochs 100

Training a model with a different layer structure:

$ python src/main.py --node-filters 16
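The flags compose, so several hyperparameters can be changed in one run (the values below are arbitrary):

$ python src/main.py --epochs 300 --learning-rate 0.005 --edge-filters 16 --node-filters 16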

License


GPL-3.0 license.