PPNP and APPNP

TensorFlow and PyTorch implementations of the model proposed in the paper:

Predict then Propagate: Graph Neural Networks meet Personalized PageRank
by Johannes Klicpera, Aleksandar Bojchevski, Stephan Günnemann
Published at ICLR 2019.
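
At its core, the model follows a two-step scheme: a neural network first produces per-node class predictions ("predict"), and these predictions are then smoothed over the graph with (approximate) personalized PageRank ("propagate"). The following is a minimal, illustrative NumPy/SciPy sketch of the APPNP power-iteration variant of this propagation. It is not the repository's implementation; the function and argument names are chosen only for this example.

```python
import numpy as np
import scipy.sparse as sp


def appnp_propagate(adj: sp.spmatrix, predictions: np.ndarray,
                    alpha: float = 0.1, n_iter: int = 10) -> np.ndarray:
    """Smooth per-node predictions with approximate personalized PageRank."""
    n = adj.shape[0]

    # Symmetrically normalize the adjacency matrix with added self-loops:
    # A_hat = D^{-1/2} (A + I) D^{-1/2}
    adj_loops = adj + sp.eye(n)
    deg = np.asarray(adj_loops.sum(axis=1)).flatten()
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(deg))
    adj_norm = d_inv_sqrt @ adj_loops @ d_inv_sqrt

    # "Predict then propagate": start from the network's predictions H and
    # run the power iteration  Z <- (1 - alpha) * A_hat @ Z + alpha * H.
    z = predictions
    for _ in range(n_iter):
        z = (1 - alpha) * (adj_norm @ z) + alpha * predictions
    return z  # for classification, apply a softmax over classes afterwards
```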

Run the code

The easiest way to get started is to look at the notebooks simple_example_tensorflow.ipynb and simple_example_pytorch.ipynb. The notebook reproduce_results.ipynb shows how to reproduce the results from the paper.

Requirements

The repository uses these packages:

numpy
scipy
tensorflow>=1.6,<2.0
pytorch>=1.5

You can install all requirements via pip install -r requirements.txt. In practice, however, you will only need either TensorFlow or PyTorch, depending on which implementation you use. If you use the networkx_to_sparsegraph method for importing other datasets, you will additionally need NetworkX.

Installation

To install the package, run python setup.py install.

Datasets

In the data folder you can find several datasets. If you want to use other (external) datasets, you can, for example, use the networkx_to_sparsegraph method in ppnp.data.io to convert NetworkX graphs to our SparseGraph format, as in the sketch below.
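
For illustration, such a conversion could look roughly like the following. The method name and module are the ones mentioned above, but the exact call signature (and any keyword arguments) is an assumption and should be checked against ppnp.data.io.

```python
import networkx as nx

# networkx_to_sparsegraph is the converter mentioned above (in ppnp.data.io);
# its exact signature may differ, so treat this call as an assumption.
from ppnp.data.io import networkx_to_sparsegraph

# Any NetworkX graph can serve as input; here we use a small built-in example.
nx_graph = nx.karate_club_graph()

# Convert the NetworkX graph to the repository's SparseGraph format.
sparse_graph = networkx_to_sparsegraph(nx_graph)
print(sparse_graph)
```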

The Cora-ML graph was extracted by Aleksandar Bojchevski and Stephan Günnemann. "Deep Gaussian embedding of attributed graphs: Unsupervised inductive learning via ranking." ICLR 2018,
while the raw data was originally published by Andrew Kachites McCallum, Kamal Nigam, Jason Rennie, and Kristie Seymore. "Automating the construction of internet portals with machine learning." Information Retrieval, 3(2):127–163, 2000.

The Citeseer graph was originally published by Prithviraj Sen, Galileo Namata, Mustafa Bilgic, Lise Getoor, Brian Gallagher, and Tina Eliassi-Rad. "Collective Classification in Network Data." AI Magazine, 29(3):93–106, 2008.

The PubMed graph was originally published by Galileo Namata, Ben London, Lise Getoor, and Bert Huang. "Query-driven Active Surveying for Collective Classification". International Workshop on Mining and Learning with Graphs (MLG) 2012.

The Microsoft Academic graph was originally published by Oleksandr Shchur, Maximilian Mumme, Aleksandar Bojchevski, Stephan Günnemann. "Pitfalls of Graph Neural Network Evaluation". Relational Representation Learning Workshop (R2L), NeurIPS 2018.

Contact

Please contact [email protected] if you have any questions.

Cite

Please cite our paper if you use the model or this code in your own work:

@inproceedings{klicpera_predict_2019,
	title = {Predict then Propagate: Graph Neural Networks meet Personalized PageRank},
	author = {Klicpera, Johannes and Bojchevski, Aleksandar and G{\"u}nnemann, Stephan},
	booktitle = {International Conference on Learning Representations (ICLR)},
	year = {2019}
}