Hyper-SAGNN: a self-attention based graph neural network for hypergraphs

This is an implementation of "Hyper-SAGNN: a self-attention based graph neural network for hypergraphs" (ICLR 2020).
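To give a feel for the method the paper's title describes, here is a toy NumPy sketch of the scoring idea, under simplifying assumptions: for each node in a candidate hyperedge, a "static" embedding comes from a position-wise feed-forward layer and a "dynamic" embedding from self-attention over the tuple, and their squared difference is mapped to a per-node pseudo-probability. This sketch uses a single attention head, random weights, and tanh activations purely for illustration; it is not the repo's actual model, which lives in Code/.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d_model = 8
k = 3                                     # size of one candidate hyperedge
X = rng.normal(size=(k, d_model))         # toy input features for the k nodes

# Static embeddings: a position-wise feed-forward layer shared across nodes
# (each node is transformed independently of the others in the tuple).
W_s = rng.normal(size=(d_model, d_model))
static = np.tanh(X @ W_s)

# Dynamic embeddings: self-attention across the tuple, so each node's
# embedding depends on which other nodes share the hyperedge.
W_q = rng.normal(size=(d_model, d_model))
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))
Q, K, V = X @ W_q, X @ W_k, X @ W_v
attn = softmax(Q @ K.T / np.sqrt(d_model))
dynamic = np.tanh(attn @ V)

# Per-node pseudo-probability from the squared difference of the two
# embeddings, averaged over the tuple to score the whole hyperedge.
w_o = rng.normal(size=(d_model,))         # hypothetical output weights
p_node = sigmoid(((dynamic - static) ** 2) @ w_o)
score = p_node.mean()
print(0.0 < score < 1.0)  # True
```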

The datasets included in this repo are originally from DHNE (https://github.com/tadpole/DHNE).

Requirements

Python >= 3.6.8

TensorFlow >= 1.0.0 (< 2.0.0)

PyTorch >= 1.0
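Since the TensorFlow requirement has both a lower and an upper bound, a small helper like the following can sanity-check version strings before running; this is a hypothetical convenience, not part of the repo, and it only compares the first three numeric components of a version string.

```python
# Hypothetical helpers (not part of this repo) for checking that an
# installed version string satisfies the bounds listed above.
def version_tuple(v):
    # Compare only the first three numeric components, e.g. "1.15.0" -> (1, 15, 0).
    return tuple(int(p) for p in v.split(".")[:3])

def meets(installed, minimum, below=None):
    t = version_tuple(installed)
    ok = t >= version_tuple(minimum)
    if below is not None:
        ok = ok and t < version_tuple(below)
    return ok

# TensorFlow must satisfy >= 1.0.0 and < 2.0.0 for this codebase.
print(meets("1.15.0", "1.0.0", below="2.0.0"))  # True
print(meets("2.3.1", "1.0.0", below="2.0.0"))   # False
```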

Usage

To run the code:

cd Code
python main.py --data wordnet -f adj

Change the following arguments to reproduce the corresponding results from the manuscript:

The --data argument accepts "GPS", "drug", "MovieLens", or "wordnet". This argument is case-sensitive.

The -f, --feature argument accepts "adj" or "walk", selecting the encoder-based approach or the random-walk-based approach, respectively.

Other arguments are as follows:

parser.add_argument('--dimensions', type=int, default=64,
                    help='Number of dimensions. Default is 64.')

parser.add_argument('-l', '--walk-length', type=int, default=40,
                    help='Length of walk per source. Default is 40.')

parser.add_argument('-r', '--num-walks', type=int, default=10,
                    help='Number of walks per source. Default is 10.')

parser.add_argument('-k', '--window-size', type=int, default=10,
                    help='Context size for optimization. Default is 10.')

Cite

If you want to cite our paper:

@inproceedings{
zhang2020hypersagnn,
title={Hyper-{SAGNN}: a self-attention based graph neural network for hypergraphs},
author={Zhang, Ruochi and Zou, Yuesong and Ma, Jian},
booktitle={International Conference on Learning Representations (ICLR)},
year={2020}
}