
DeepGraphLearning / NBFNet

License: MIT
Official implementation of Neural Bellman-Ford Networks (NeurIPS 2021)

Programming Languages

python
139,335 projects - #7 most used programming language

Projects that are alternatives of or similar to NBFNet

Grakn
TypeDB: a strongly-typed database
Stars: ✭ 2,947 (+2680.19%)
Mutual labels:  knowledge-graph, reasoning
typedb
TypeDB: a strongly-typed database
Stars: ✭ 3,152 (+2873.58%)
Mutual labels:  knowledge-graph, reasoning
Stellargraph
StellarGraph - Machine Learning on Graphs
Stars: ✭ 2,235 (+2008.49%)
Mutual labels:  link-prediction, graph-neural-networks
PathCon
Combining relational context and relational paths for knowledge graph completion
Stars: ✭ 94 (-11.32%)
Mutual labels:  knowledge-graph, graph-neural-networks
kglib
TypeDB-ML is the Machine Learning integrations library for TypeDB
Stars: ✭ 523 (+393.4%)
Mutual labels:  knowledge-graph, link-prediction
KGPool
[ACL 2021] KGPool: Dynamic Knowledge Graph Context Selection for Relation Extraction
Stars: ✭ 33 (-68.87%)
Mutual labels:  knowledge-graph, graph-neural-networks
NMN
Source code and datasets for ACL 2020 paper: Neighborhood Matching Network for Entity Alignment.
Stars: ✭ 55 (-48.11%)
Mutual labels:  knowledge-graph, graph-neural-networks
KGReasoning
Multi-Hop Logical Reasoning in Knowledge Graphs
Stars: ✭ 197 (+85.85%)
Mutual labels:  knowledge-graph, reasoning
GraphScope
🔨 🍇 💻 🚀 GraphScope: A One-Stop Large-Scale Graph Computing System from Alibaba (graph analytics, graph querying, graph machine learning)
Stars: ✭ 1,899 (+1691.51%)
Mutual labels:  graph-neural-networks
skipchunk
Extracts a latent knowledge graph from text and indexes/queries it in Elasticsearch or Solr
Stars: ✭ 18 (-83.02%)
Mutual labels:  knowledge-graph
InfoGraph
Official code for "InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization" (ICLR 2020, spotlight)
Stars: ✭ 222 (+109.43%)
Mutual labels:  graph-neural-networks
LPGNN
Locally Private Graph Neural Networks (ACM CCS 2021)
Stars: ✭ 30 (-71.7%)
Mutual labels:  graph-neural-networks
Shukongdashi
An expert system for fault diagnosis in the CNC machining domain, built in Python using knowledge graphs, natural language processing, and convolutional neural networks
Stars: ✭ 109 (+2.83%)
Mutual labels:  knowledge-graph
ChineseTextAnalysisResouce
A curated collection of resources for Chinese text analysis
Stars: ✭ 71 (-33.02%)
Mutual labels:  knowledge-graph
CoLAKE
COLING'2020: CoLAKE: Contextualized Language and Knowledge Embedding
Stars: ✭ 86 (-18.87%)
Mutual labels:  knowledge-graph
graphml-tutorials
Tutorials for Machine Learning on Graphs
Stars: ✭ 125 (+17.92%)
Mutual labels:  graph-neural-networks
deepsphere-weather
A spherical CNN for weather forecasting
Stars: ✭ 44 (-58.49%)
Mutual labels:  graph-neural-networks
LambdaNet
Probabilistic Type Inference using Graph Neural Networks
Stars: ✭ 39 (-63.21%)
Mutual labels:  graph-neural-networks
graphchem
Graph-based machine learning for chemical property prediction
Stars: ✭ 21 (-80.19%)
Mutual labels:  graph-neural-networks

NBFNet: Neural Bellman-Ford Networks

This is the official codebase of the paper

Neural Bellman-Ford Networks: A General Graph Neural Network Framework for Link Prediction

Zhaocheng Zhu, Zuobai Zhang, Louis-Pascal Xhonneux, Jian Tang

A PyG re-implementation of NBFNet can be found here.

Overview

NBFNet is a graph neural network framework inspired by traditional path-based methods. It enjoys the advantages of both traditional path-based methods and modern graph neural networks, including generalization in the inductive setting, interpretability, high model capacity and scalability. NBFNet can be applied to solve link prediction on both homogeneous graphs and knowledge graphs.
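The core computation that NBFNet parameterizes is a generalized Bellman-Ford iteration: initialize a source-conditioned indicator, propagate messages along relational edges, and aggregate them at every node. Below is a minimal, hypothetical sketch of that idea in plain PyTorch; the function name, tensor layout, and the simple additive message / sum aggregation are illustrative assumptions rather than the repository's API.

# Hypothetical sketch of a generalized Bellman-Ford iteration (not the repository's API).
import torch

def bellman_ford_sketch(edge_index, edge_type, relation_emb, source, num_nodes, num_layers=3):
    # edge_index:   LongTensor (2, num_edges) holding (head, tail) node ids
    # edge_type:    LongTensor (num_edges,) holding relation ids
    # relation_emb: FloatTensor (num_relations, dim) of learned relation vectors
    # source:       id of the source node the representations are conditioned on
    dim = relation_emb.size(1)
    h = torch.zeros(num_nodes, dim)
    h[source] = 1.0                      # boundary condition ("indicator" function)
    for _ in range(num_layers):
        head, tail = edge_index
        # Message function: combine head representations with relation embeddings
        # (a plain sum here; the paper studies TransE/DistMult/RotatE-style operators).
        messages = h[head] + relation_emb[edge_type]
        # Aggregate function: sum incoming messages at each tail node.
        new_h = torch.zeros_like(h)
        new_h.index_add_(0, tail, messages)
        h = h + new_h                    # keep a residual term so the boundary persists
    return h                             # h[v] acts as a pair representation for (source, v)

In the paper, the message and aggregate functions are learnable operators (e.g., relation-specific multiplication and sum/PNA-style aggregation), and the resulting pair representation is scored by a feed-forward network for link prediction.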


This codebase is based on PyTorch and TorchDrug. It supports training and inference with multiple GPUs or multiple machines.

Installation

You may install the dependencies via either conda or pip. Generally, NBFNet works with Python 3.7/3.8 and PyTorch version >= 1.8.0.

From Conda

conda install torchdrug pytorch=1.8.2 cudatoolkit=11.1 -c milagraph -c pytorch-lts -c pyg -c conda-forge
conda install ogb easydict pyyaml -c conda-forge

From Pip

pip install torch==1.8.2+cu111 -f https://download.pytorch.org/whl/lts/1.8/torch_lts.html
pip install torchdrug
pip install ogb easydict pyyaml
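
A quick optional sanity check after either installation route (standard PyTorch calls, nothing specific to this repository):

# Verify that the PyTorch build matches the installation commands above.
import torch
import torchdrug  # fails here if TorchDrug is not installed

print(torch.__version__)          # expect a 1.8.x LTS build
print(torch.version.cuda)         # expect 11.1 for the cu111 wheels
print(torch.cuda.is_available())  # True if a GPU and a matching driver are visible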

Reproduction

To reproduce the results of NBFNet, use the following command. Alternatively, you may use --gpus null to run NBFNet on a CPU. All the datasets are downloaded automatically by the code.

python script/run.py -c config/inductive/wn18rr.yaml --gpus [0] --version v1

We provide the hyperparameters for each experiment in configuration files. All the configuration files can be found in config/*/*.yaml.

For experiments on inductive relation prediction, you need to additionally specify the split version with --version v1.
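
For instance, the same inductive WN18RR experiment can also be run on a CPU (much more slowly) by combining the flags above:

python script/run.py -c config/inductive/wn18rr.yaml --gpus null --version v1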

To run NBFNet with multiple GPUs or multiple machines, use the following commands. Each entry in --gpus specifies the device used by one process, so the multi-machine example assigns GPUs 0-3 on each of the 4 nodes.

python -m torch.distributed.launch --nproc_per_node=4 script/run.py -c config/inductive/wn18rr.yaml --gpus [0,1,2,3]
python -m torch.distributed.launch --nnodes=4 --nproc_per_node=4 script/run.py -c config/inductive/wn18rr.yaml --gpus [0,1,2,3,0,1,2,3,0,1,2,3,0,1,2,3]

Visualize Interpretations on FB15k-237

Once you have models trained on FB15k-237, you can visualize the path interpretations with the following command. Please replace the checkpoint with your own path.

python script/visualize.py -c config/knowledge_graph/fb15k237_visualize.yaml --checkpoint /path/to/nbfnet/experiment/model_epoch_20.pth

Evaluate ogbl-biokg

Due to the large size of ogbl-biokg, we only evaluate on a small portion of the validation set during training. The following line evaluates a model on the full validation / test sets of ogbl-biokg. Please replace the checkpoint with your own path.

python script/run.py -c config/knowledge_graph/ogbl-biokg_test.yaml --checkpoint /path/to/nbfnet/experiment/model_epoch_10.pth
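
If you want to inspect a checkpoint before evaluation, a generic PyTorch snippet (not specific to this repository) is sufficient:

# Load a checkpoint on the CPU and list its top-level entries.
import torch

state = torch.load("/path/to/nbfnet/experiment/model_epoch_10.pth", map_location="cpu")
print(list(state.keys()))  # typically model (and possibly optimizer) state dicts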

Results

Here are the results of NBFNet on standard benchmark datasets. All the results are obtained with 4 V100 GPUs (32GB). Note that results may be slightly different if the model is trained with 1 GPU and/or a smaller batch size.

Knowledge Graph Completion

Dataset      MR    MRR    HITS@1   HITS@3   HITS@10
FB15k-237    114   0.415  0.321    0.454    0.599
WN18RR       636   0.551  0.497    0.573    0.666
ogbl-biokg   -     0.829  0.768    0.870    0.946

Homogeneous Graph Link Prediction

Dataset     AUROC   AP
Cora        0.956   0.962
CiteSeer    0.923   0.936
PubMed      0.983   0.982

Inductive Relation Prediction

Dataset      HITS@10 (50 sample)
             v1       v2       v3       v4
FB15k-237    0.834    0.949    0.951    0.960
WN18RR       0.948    0.905    0.893    0.890

Frequently Asked Questions

  1. The code is stuck at the beginning of epoch 0.

    This is probably because the JIT cache is broken. Try rm -r ~/.cache/torch_extensions/* and run the code again.

Citation

If you find this codebase useful in your research, please cite the following paper.

@article{zhu2021neural,
  title={Neural bellman-ford networks: A general graph neural network framework for link prediction},
  author={Zhu, Zhaocheng and Zhang, Zuobai and Xhonneux, Louis-Pascal and Tang, Jian},
  journal={Advances in Neural Information Processing Systems},
  volume={34},
  year={2021}
}