
kkteru / grail

Licence: other
Inductive relation prediction by subgraph reasoning, ICML'20

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives of or similar to grail

QGNN
Quaternion Graph Neural Networks (ACML 2021) (Pytorch and Tensorflow)
Stars: ✭ 31 (-62.65%)
Mutual labels:  graph-neural-networks, graph-representation-learning
3DInfomax
Making self-supervised learning work on molecules by using their 3D geometry to pre-train GNNs. Implemented in DGL and Pytorch Geometric.
Stars: ✭ 107 (+28.92%)
Mutual labels:  graph-neural-networks, graph-representation-learning
Graph Based Deep Learning Literature
links to conference publications in graph-based deep learning
Stars: ✭ 3,428 (+4030.12%)
Mutual labels:  graph-neural-networks, graph-representation-learning
awesome-efficient-gnn
Code and resources on scalable and efficient Graph Neural Networks
Stars: ✭ 498 (+500%)
Mutual labels:  graph-neural-networks, graph-representation-learning
SubGNN
Subgraph Neural Networks (NeurIPS 2020)
Stars: ✭ 136 (+63.86%)
Mutual labels:  graph-neural-networks, graph-representation-learning
SIAN
Code and data for ECML-PKDD paper "Social Influence Attentive Neural Network for Friend-Enhanced Recommendation"
Stars: ✭ 25 (-69.88%)
Mutual labels:  graph-neural-networks, heterogeneous-graph-neural-network
GNNLens2
Visualization tool for Graph Neural Networks
Stars: ✭ 155 (+86.75%)
Mutual labels:  graph-neural-networks, graph-representation-learning
GNN-Recommender-Systems
An index of recommendation algorithms that are based on Graph Neural Networks.
Stars: ✭ 505 (+508.43%)
Mutual labels:  graph-neural-networks, graph-representation-learning
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+98.8%)
Mutual labels:  graph-neural-networks, graph-representation-learning
SelfTask-GNN
Implementation of paper "Self-supervised Learning on Graphs:Deep Insights and New Directions"
Stars: ✭ 78 (-6.02%)
Mutual labels:  graph-neural-networks
AGCN
No description or website provided.
Stars: ✭ 17 (-79.52%)
Mutual labels:  graph-neural-networks
GCA
[WWW 2021] Source code for "Graph Contrastive Learning with Adaptive Augmentation"
Stars: ✭ 69 (-16.87%)
Mutual labels:  graph-representation-learning
gemnet pytorch
GemNet model in PyTorch, as proposed in "GemNet: Universal Directional Graph Neural Networks for Molecules" (NeurIPS 2021)
Stars: ✭ 80 (-3.61%)
Mutual labels:  graph-neural-networks
mtad-gat-pytorch
PyTorch implementation of MTAD-GAT (Multivariate Time-Series Anomaly Detection via Graph Attention Networks) by Zhao et al. (2020, https://arxiv.org/abs/2009.02040).
Stars: ✭ 85 (+2.41%)
Mutual labels:  graph-neural-networks
LogicCircuits.jl
Logic Circuits from the Juice library
Stars: ✭ 39 (-53.01%)
Mutual labels:  logical-reasoning
MTAG
Code for NAACL 2021 paper: MTAG: Modal-Temporal Attention Graph for Unaligned Human Multimodal Language Sequences
Stars: ✭ 23 (-72.29%)
Mutual labels:  graph-neural-networks
Walk-Transformer
From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (In Pytorch and Tensorflow)
Stars: ✭ 26 (-68.67%)
Mutual labels:  graph-neural-networks
H-GCN
[IJCAI 2019] Source code and datasets for "Hierarchical Graph Convolutional Networks for Semi-supervised Node Classification"
Stars: ✭ 103 (+24.1%)
Mutual labels:  graph-neural-networks
NMN
Source code and datasets for ACL 2020 paper: Neighborhood Matching Network for Entity Alignment.
Stars: ✭ 55 (-33.73%)
Mutual labels:  graph-neural-networks
DIN-Group-Activity-Recognition-Benchmark
A new codebase for Group Activity Recognition. It contains codes for ICCV 2021 paper: Spatio-Temporal Dynamic Inference Network for Group Activity Recognition and some other methods.
Stars: ✭ 26 (-68.67%)
Mutual labels:  graph-neural-networks

GraIL - Graph Inductive Learning

This is the code necessary to run experiments with the GraIL algorithm described in the ICML'20 paper Inductive relation prediction by subgraph reasoning.

Requirements

All the required packages can be installed by running pip install -r requirements.txt.

Inductive relation prediction experiments

All train-graph and ind-test-graph pairs can be found in the data folder. We use WN18RR_v1 as a running example for illustrating the steps.
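
Each dataset folder contains plain-text triplet files. As a rough sketch of how they can be loaded (the train.txt file name and the tab-separated head/relation/tail layout are assumptions here; check the data folder for the actual format):

  import os

  def load_triplets(path):
      # One triplet per line; assumed tab-separated head, relation, tail.
      with open(path) as f:
          return [tuple(line.strip().split("\t")) for line in f]

  train = load_triplets(os.path.join("data", "WN18RR_v1", "train.txt"))
  print(len(train), "training triplets")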

GraIL

To start training a GraIL model, run the following command.

python train.py -d WN18RR_v1 -e grail_wn_v1
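
To train a model on every inductive split of a dataset, the same command can be scripted. A minimal sketch, assuming the v1-v4 split naming used by the paper's inductive benchmarks (the experiment names passed to -e are arbitrary):

  import subprocess

  # Train one GraIL model per inductive split of WN18RR.
  for v in range(1, 5):
      subprocess.run(
          ["python", "train.py", "-d", f"WN18RR_v{v}", "-e", f"grail_wn_v{v}"],
          check=True,
      )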

To test GraIL, run the following commands; a short sketch of the reported metrics follows the list.

  • python test_auc.py -d WN18RR_v1_ind -e grail_wn_v1
  • python test_ranking.py -d WN18RR_v1_ind -e grail_wn_v1
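
The two scripts report complementary metrics: test_auc.py treats scoring as classification of positive triplets against sampled negatives, while test_ranking.py ranks each positive triplet against candidate negatives. A toy illustration of the metrics themselves, not of the scripts' internals (the paper reports AUC-PR for the classification setting and Hits@10 for ranking):

  import numpy as np
  from sklearn.metrics import average_precision_score

  pos_scores = np.array([0.9, 0.7, 0.8])  # model scores for true triplets
  neg_scores = np.array([0.2, 0.6, 0.1])  # scores for sampled negatives

  # Classification view: AUC-PR over positives vs. negatives.
  labels = np.concatenate([np.ones(3), np.zeros(3)])
  scores = np.concatenate([pos_scores, neg_scores])
  print("AUC-PR:", average_precision_score(labels, scores))

  # Ranking view: rank each positive among its candidate negatives.
  candidates = np.array([[0.2, 0.6, 0.1], [0.3, 0.9, 0.4], [0.5, 0.2, 0.6]])
  ranks = 1 + (candidates >= pos_scores[:, None]).sum(axis=1)
  print("Hits@10:", (ranks <= 10).mean())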

The trained model and the logs are stored in the experiments folder. Note that to ensure a fair comparison, we test all models on the same negative triplets. In the current setup, we store the sampled negative triplets while evaluating GraIL and reuse them later to evaluate the other baseline models.
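
A minimal sketch of that bookkeeping (the file name and pickle format are illustrative assumptions; the actual scripts handle this internally):

  import pickle

  # While evaluating GraIL: persist the sampled negatives once.
  neg_triplets = [("node_a", "rel_1", "node_b"), ("node_c", "rel_2", "node_d")]
  with open("neg_triplets.pkl", "wb") as f:
      pickle.dump(neg_triplets, f)

  # While evaluating a baseline: reload the exact same negatives so every
  # model is scored against identical negative samples.
  with open("neg_triplets.pkl", "rb") as f:
      shared_negatives = pickle.load(f)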

RuleN

RuleN operates in two steps. Rules are first learned from a training graph and then applied to the test graph. Detailed instructions can be found here.

  • Learn rules: source learn_rules.sh WN18RR_v1
  • Apply rules:
    • To get AUC: source auc_apply_rules.sh WN18RR_v1 WN18RR_v1_ind num_of_samples_to_score(=1000)
    • To get ranking score: source rank_apply_rules.sh WN18RR_v1 WN18RR_v1_ind num_of_samples_to_score(=1000)

NeuralLP and DRUM

We use the implementations provided by the authors of the respective papers to evaluate these models.

Transductive experiments

The full transductive datasets used in these experiments are present in the data folder.

GraIL

The training and testing protocols of GraIL remain the same.

KGE models

We use the comprehensive implementation provided by the authors of RotatE, which gives state-of-the-art results on all datasets. The best configurations can be found here. To train these KGE models, navigate to the kge folder and run the commands as shown in the above reference. For example, to train TransE on FB15k-237, run the following command.

bash run.sh train TransE FB15k-237 0 0 1024 256 1000 9.0 1.0 0.00005 100000 16

This will store the trained model and the logs in a folder named experiments/kge_baselines/TransE_FB15k-237.
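
The positional arguments encode the training hyperparameters. As an aid to reading them, a small Python launcher with the arguments named (the names are taken from the run.sh script in the RotatE codebase and should be verified against your copy):

  import subprocess

  # Positional arguments of run.sh, in order; names assumed from the
  # upstream RotatE run.sh.
  args = dict(
      mode="train", model="TransE", dataset="FB15k-237",
      gpu_device="0", save_id="0",
      batch_size="1024", negative_sample_size="256", hidden_dim="1000",
      gamma="9.0", alpha="1.0", learning_rate="0.00005",
      max_steps="100000", test_batch_size="16",
  )
  subprocess.run(["bash", "run.sh", *args.values()], check=True)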

Ensembling instructions

Once the KGE models are trained, to get ensembling results with GraIL, navigate to the ensembling folder and run the following command.

source get_ensemble_predictions.sh WN18RR TransE

To get ensembling results among different KGE models, run the following command from the ensembling folder.

source get_kge_predictions.sh WN18RR TransE ComplEx
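
Under the hood, ensembling combines per-triplet scores from the two models. A minimal sketch of one common combination, averaging min-max-normalized scores (the exact rule used by the scripts may differ):

  import numpy as np

  def normalize(scores):
      # Min-max normalize so scores from different models are comparable.
      scores = np.asarray(scores, dtype=float)
      return (scores - scores.min()) / (scores.max() - scores.min())

  grail_scores = np.array([0.9, 0.1, 0.4])  # toy per-triplet scores
  kge_scores = np.array([12.3, -4.0, 3.3])

  ensemble = (normalize(grail_scores) + normalize(kge_scores)) / 2
  print(ensemble)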

If you make use of this code or the GraIL algorithm in your work, please cite the following paper:

@article{Teru2020InductiveRP,
  title={Inductive Relation Prediction by Subgraph Reasoning.},
  author={Komal K. Teru and Etienne Denis and William L. Hamilton},
  journal={arXiv: Learning},
  year={2020}
}