ucbrise / graphtrans

License: Apache-2.0
Representing Long-Range Context for Graph Neural Networks with Global Attention

Programming Languages

python
shell

Projects that are alternatives of or similar to graphtrans

Walk-Transformer
From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (in PyTorch and TensorFlow)
Stars: ✭ 26 (-42.22%)
Mutual labels:  transformer, graph-neural-networks
well-classified-examples-are-underestimated
Code for the AAAI 2022 publication "Well-classified Examples are Underestimated in Classification with Deep Neural Networks"
Stars: ✭ 21 (-53.33%)
Mutual labels:  transformer, graph-neural-networks
kaggle-champs
Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (+8.89%)
Mutual labels:  transformer, graph-neural-networks
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (+266.67%)
Mutual labels:  graph-neural-networks
COVID-19-Tweet-Classification-using-Roberta-and-Bert-Simple-Transformers
Rank 1 / 216
Stars: ✭ 24 (-46.67%)
Mutual labels:  transformer
GeometricFlux.jl
Geometric Deep Learning for Flux
Stars: ✭ 288 (+540%)
Mutual labels:  graph-neural-networks
Xpersona
XPersona: Evaluating Multilingual Personalized Chatbot
Stars: ✭ 54 (+20%)
Mutual labels:  transformer
GTSRB Keras STN
German Traffic Sign Recognition Benchmark, Keras implementation with Spatial Transformer Networks
Stars: ✭ 48 (+6.67%)
Mutual labels:  transformer
tf2-transformer-chatbot
Transformer Chatbot in TensorFlow 2 with TPU support.
Stars: ✭ 94 (+108.89%)
Mutual labels:  transformer
eeg-gcnn
Resources for the paper titled "EEG-GCNN: Augmenting Electroencephalogram-based Neurological Disease Diagnosis using a Domain-guided Graph Convolutional Neural Network". Accepted for publication (with an oral spotlight!) at ML4H Workshop, NeurIPS 2020.
Stars: ✭ 50 (+11.11%)
Mutual labels:  graph-neural-networks
transform-graphql
⚙️ Transformer function to transform GraphQL directives; for example, create a model CRUD directive.
Stars: ✭ 23 (-48.89%)
Mutual labels:  transformer
BossNAS
(ICCV 2021) BossNAS: Exploring Hybrid CNN-transformers with Block-wisely Self-supervised Neural Architecture Search
Stars: ✭ 125 (+177.78%)
Mutual labels:  transformer
image-classification
A collection of SOTA Image Classification Models in PyTorch
Stars: ✭ 70 (+55.56%)
Mutual labels:  transformer
OpenPrompt
An Open-Source Framework for Prompt-Learning.
Stars: ✭ 1,769 (+3831.11%)
Mutual labels:  transformer
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (-2.22%)
Mutual labels:  transformer
graph-transformer-pytorch
Implementation of Graph Transformer in PyTorch, for potential use in replicating AlphaFold2
Stars: ✭ 81 (+80%)
Mutual labels:  transformer
KGPool
[ACL 2021] KGPool: Dynamic Knowledge Graph Context Selection for Relation Extraction
Stars: ✭ 33 (-26.67%)
Mutual labels:  graph-neural-networks
YOLOv5-Lite
🍅🍅🍅 YOLOv5-Lite: lighter, faster, and easier to deploy. Evolved from YOLOv5; the model is only 930+ KB (int8) or 1.7 MB (fp16) and can reach 10+ FPS on a Raspberry Pi 4B with a 320×320 input.
Stars: ✭ 1,230 (+2633.33%)
Mutual labels:  transformer
TransPose
PyTorch Implementation for "TransPose: Keypoint localization via Transformer", ICCV 2021.
Stars: ✭ 250 (+455.56%)
Mutual labels:  transformer
towhee
Towhee is a framework dedicated to making neural data processing pipelines simple and fast.
Stars: ✭ 821 (+1724.44%)
Mutual labels:  transformer

Representing Long-Range Context for Graph Neural Networks with Global Attention

@inproceedings{Wu2021GraphTrans,
  title={Representing Long-Range Context for Graph Neural Networks with Global Attention},
  author={Wu, Zhanghao and Jain, Paras and Wright, Matthew and Mirhoseini, Azalia and Gonzalez, Joseph E and Stoica, Ion},
  booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
  year={2021}
}

Overview

We release the PyTorch code for GraphTrans, introduced in the paper cited above.
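
At a high level, GraphTrans stacks a standard GNN, which captures local structure, with a Transformer encoder that applies permutation-invariant global attention over the GNN's node embeddings and reads a graph-level embedding out of a special CLS token. The sketch below illustrates that pattern in PyTorch; the class name, GCNConv backbone, dimensions, and all other details are illustrative assumptions, not the repository's actual code.

# Minimal GNN -> Transformer sketch (illustrative only; see the repo for the real model).
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv
from torch_geometric.utils import to_dense_batch

class GraphTransSketch(nn.Module):
    def __init__(self, in_dim, hidden_dim=128, num_heads=4, num_layers=2):
        super().__init__()
        self.gnn1 = GCNConv(in_dim, hidden_dim)        # local message passing
        self.gnn2 = GCNConv(hidden_dim, hidden_dim)
        layer = nn.TransformerEncoderLayer(hidden_dim, num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers)
        self.cls = nn.Parameter(torch.zeros(1, 1, hidden_dim))  # learned CLS token

    def forward(self, x, edge_index, batch):
        h = torch.relu(self.gnn1(x, edge_index))
        h = torch.relu(self.gnn2(h, edge_index))
        h, mask = to_dense_batch(h, batch)             # [num_graphs, max_nodes, dim]
        cls = self.cls.expand(h.size(0), -1, -1)
        h = torch.cat([cls, h], dim=1)                 # prepend CLS to every graph
        keep = torch.cat([torch.ones(h.size(0), 1, dtype=torch.bool, device=h.device), mask], dim=1)
        h = self.transformer(h, src_key_padding_mask=~keep)
        return h[:, 0]                                 # CLS output = graph embedding

In the actual repo, the choice of GNN backbone (GCN, GIN, PNA, with or without a virtual node) and the pooling variant are selected by the YAML configs referenced in the results table below.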

Installation

To set up the Python environment, please install Conda first. All required dependencies are listed in requirement.yml.

conda env create -f requirement.yml
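
After the environment is created, activate it and run a quick import check. The environment name graphtrans below is an assumption; the actual name is set by the name: field in requirement.yml.

conda activate graphtrans  # environment name is an assumption; see the name: field in requirement.yml
python -c "import torch; print(torch.__version__)"  # core dependency check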

How to Run

To run the experiments, please refer to the commands below (taking OGBG-Code2 as an example):

# GraphTrans (GCN-Virtual)
python main.py --configs configs/code2/gnn-transformer/JK=cat/pooling=cls+norm_input.yml --runs 5
# GraphTrans (GCN)
python main.py --configs configs/code2/gnn-transformer/no-virtual/pooling=cls+norm_input.yml --runs 5
# Or, to run via Slurm:
sbatch ./slurm-run.sh "configs/code2/gnn-transformer/JK=cat/pooling=cls+norm_input.yml --runs 5"
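
Each experiment is driven by the YAML file passed via --configs. How main.py actually parses these files is not shown here; the sketch below only illustrates the general pattern of merging one or more YAML configs with command-line arguments (the argument names mirror the commands above; everything else is a placeholder).

# Illustrative config handling (a sketch, not the repo's actual main.py).
import argparse
import yaml

parser = argparse.ArgumentParser()
parser.add_argument("--configs", nargs="+", default=[])   # one or more YAML files
parser.add_argument("--runs", type=int, default=1)        # repeat count, as in the commands above
args = parser.parse_args()

config = {}
for path in args.configs:
    with open(path) as f:
        config.update(yaml.safe_load(f))                  # later files override earlier ones
print(f"{args.runs} run(s) with config keys: {sorted(config)}")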

The config path for each dataset/model can be found in the result table below.

Results

Dataset        Model                      Valid            Test             Config
OGBG-Code2     GraphTrans (GCN)           0.1599±0.0009    0.1751±0.0015    Config
OGBG-Code2     GraphTrans (PNA)           0.1622±0.0025    0.1765±0.0033    Config
OGBG-Code2     GraphTrans (GCN-Virtual)   0.1661±0.0012    0.1830±0.0024    Config
OGBG-Molpcba   GraphTrans (GIN)           0.2893±0.0050    0.2756±0.0039    Config
OGBG-Molpcba   GraphTrans (GIN-Virtual)   0.2867±0.0022    0.2761±0.0029    Config
NCI1           GraphTrans (small, GCN)    -                81.3±1.9         Config
NCI1           GraphTrans (large, GIN)    -                82.6±1.2         Config
NCI109         GraphTrans (small, GCN)    -                79.2±2.2         Config
NCI109         GraphTrans (large, GIN)    -                82.3±2.6         Config
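
The ± figures summarize variation across repeated runs (--runs 5 in the commands above). A minimal sketch of that style of aggregation, with made-up numbers:

# Aggregate per-run metrics into mean±std (illustrative; scores are made up).
import statistics

scores = [0.1824, 0.1836, 0.1828, 0.1841, 0.1821]  # e.g., one metric from 5 runs
print(f"{statistics.mean(scores):.4f}±{statistics.stdev(scores):.4f}")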