
tech-srl / how_attentive_are_gats

Licence: other
Code for the paper "How Attentive are Graph Attention Networks?" (ICLR'2022)

Programming Languages

python, shell

Projects that are alternatives to or similar to how_attentive_are_gats

mtad-gat-pytorch
PyTorch implementation of MTAD-GAT (Multivariate Time-Series Anomaly Detection via Graph Attention Networks) by Zhao et al. (2020, https://arxiv.org/abs/2009.02040).
Stars: ✭ 85 (-57.5%)
Mutual labels:  attention, graph-attention-networks, graph-neural-networks
PyNets
A Reproducible Workflow for Structural and Functional Connectome Ensemble Learning
Stars: ✭ 114 (-43%)
Mutual labels:  networks, graph-neural-networks
SelfGNN
A PyTorch implementation of "SelfGNN: Self-supervised Graph Neural Networks without explicit negative sampling" paper, which appeared in The International Workshop on Self-Supervised Learning for the Web (SSL'21) @ the Web Conference 2021 (WWW'21).
Stars: ✭ 24 (-88%)
Mutual labels:  graph-attention-networks, graph-neural-networks
graphml-tutorials
Tutorials for Machine Learning on Graphs
Stars: ✭ 125 (-37.5%)
Mutual labels:  networks, graph-neural-networks
gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional Representations), ICLR 2022
Stars: ✭ 165 (-17.5%)
Mutual labels:  attention, graph-neural-networks
AC-VRNN
PyTorch code for CVIU paper "AC-VRNN: Attentive Conditional-VRNN for Multi-Future Trajectory Prediction"
Stars: ✭ 21 (-89.5%)
Mutual labels:  graph-attention-networks, graph-neural-networks
CoVA-Web-Object-Detection
A Context-aware Visual Attention-based training pipeline for Object Detection from a Webpage screenshot!
Stars: ✭ 18 (-91%)
Mutual labels:  attention, graph-attention-networks
SBR
⌛ Introducing Self-Attention to Target Attentive Graph Neural Networks (AISP '22)
Stars: ✭ 22 (-89%)
Mutual labels:  attention, graph-neural-networks
Appnp
A PyTorch implementation of "Predict then Propagate: Graph Neural Networks meet Personalized PageRank" (ICLR 2019).
Stars: ✭ 234 (+17%)
Mutual labels:  attention
LR-GCCF
Revisiting Graph based Collaborative Filtering: A Linear Residual Graph Convolutional Network Approach, AAAI2020
Stars: ✭ 99 (-50.5%)
Mutual labels:  graph-neural-networks
Gam
A PyTorch implementation of "Graph Classification Using Structural Attention" (KDD 2018).
Stars: ✭ 227 (+13.5%)
Mutual labels:  attention
Jddc solution 4th
4th-place solution for the 2018 JDDC competition
Stars: ✭ 235 (+17.5%)
Mutual labels:  attention
MGAN
Exploiting Coarse-to-Fine Task Transfer for Aspect-level Sentiment Classification (AAAI'19)
Stars: ✭ 44 (-78%)
Mutual labels:  attention
Self Attention Cv
Implementation of various self-attention mechanisms focused on computer vision. Ongoing repository.
Stars: ✭ 209 (+4.5%)
Mutual labels:  attention
disentangled graph collaborative filtering
Disentangled Graph Collaborative Filtering, SIGIR 2020
Stars: ✭ 118 (-41%)
Mutual labels:  graph-neural-networks
Neat Vision
Neat (Neural Attention) Vision is a framework-agnostic visualization tool for the attention mechanisms of deep-learning models on Natural Language Processing (NLP) tasks.
Stars: ✭ 213 (+6.5%)
Mutual labels:  attention
Pen Net For Inpainting
[CVPR 2019] PEN-Net: Learning Pyramid-Context Encoder Network for High-Quality Image Inpainting
Stars: ✭ 206 (+3%)
Mutual labels:  attention
dreyeve
[TPAMI 2018] Predicting the Driver’s Focus of Attention: the DR(eye)VE Project. A deep neural network learnt to reproduce the human driver focus of attention (FoA) in a variety of real-world driving scenarios.
Stars: ✭ 88 (-56%)
Mutual labels:  attention
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (-79.5%)
Mutual labels:  attention
Cgnl Network.pytorch
Compact Generalized Non-local Network (NIPS 2018)
Stars: ✭ 252 (+26%)
Mutual labels:  attention

How Attentive are Graph Attention Networks?

This repository is the official implementation of the paper "How Attentive are Graph Attention Networks?" (GATv2).

January 2022: the paper was accepted to ICLR'2022!

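The only difference between GAT and GATv2 is the order of operations in the attention scoring function, where W is the learned weight matrix, a is the learned attention vector, and ‖ denotes concatenation:

GAT:    e(h_i, h_j) = LeakyReLU( aᵀ · [W·h_i ‖ W·h_j] )
GATv2:  e(h_i, h_j) = aᵀ · LeakyReLU( W·[h_i ‖ h_j] )

Applying the nonlinearity before the attention vector a makes GATv2's attention dynamic: the ranking over attended-to nodes can differ per query node, whereas in GAT that ranking is static.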

Using GATv2

GATv2 is now available as part of the PyTorch Geometric library!

from torch_geometric.nn.conv.gatv2_conv import GATv2Conv

https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#torch_geometric.nn.conv.GATv2Conv

It is also available in the main directory of this repository.
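A minimal usage sketch of the PyTorch Geometric layer (the toy graph, feature sizes, and head count are illustrative, not taken from the paper's experiments):

import torch
from torch_geometric.nn.conv.gatv2_conv import GATv2Conv

x = torch.randn(4, 16)                     # 4 nodes, 16 input features each
edge_index = torch.tensor([[0, 1, 2, 3],   # source nodes
                           [1, 0, 3, 2]])  # target nodes
conv = GATv2Conv(in_channels=16, out_channels=8, heads=4)
out = conv(x, edge_index)                  # shape: [4, 32] (4 heads * 8 channels, concatenated)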

GATv2 is now available as part of the DGL library!

from dgl.nn.pytorch import GATv2Conv

https://docs.dgl.ai/en/latest/api/python/nn.pytorch.html#gatv2conv

It is also available in this repository.
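The same sketch with the DGL layer; note that DGL keeps a separate head dimension rather than concatenating head outputs (again, all sizes here are illustrative):

import dgl
import torch
from dgl.nn.pytorch import GATv2Conv

g = dgl.graph(([0, 1, 2, 3], [1, 0, 3, 2]))  # 4 nodes, 4 directed edges
feat = torch.randn(4, 16)                    # 4 nodes, 16 input features each
conv = GATv2Conv(in_feats=16, out_feats=8, num_heads=4)
out = conv(g, feat)                          # shape: [4, 4, 8] (nodes, heads, channels)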

GATv2 is now available as part of Google's TensorFlow GNN library!

from tensorflow_gnn.graph.keras.layers.gat_v2 import GATv2Convolution

https://github.com/tensorflow/gnn/blob/main/tensorflow_gnn/docs/api_docs/python/gnn/keras/layers/GATv2.md
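A construction-only sketch of the TF-GNN layer; applying it requires building a tfgnn.GraphTensor and naming an edge set, which is omitted here. The argument names follow the TF-GNN documentation at the time of writing and may differ between versions:

import tensorflow_gnn as tfgnn
from tensorflow_gnn.graph.keras.layers.gat_v2 import GATv2Convolution

# 4 attention heads, 8 channels per head; messages flow toward edge targets.
conv = GATv2Convolution(num_heads=4, per_head_channels=8,
                        receiver_tag=tfgnn.TARGET)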

Code Structure

Since our experiments (Section 4) are based on different frameworks, this repository is divided into several sub-projects:

  1. The subdirectory arxiv_mag_products_collab_citation2_noise contains the needed files to reproduce the results of Node-Prediction, Link-Prediction, and Robustness to Noise (Tables 2a, 3 and Figure 4).
  2. The subdirectory proteins contains the needed files to reproduce the results of ogbn-proteins in Node-Prediction (Table 2b).
  3. The subdirectory dictionary_lookup contains the needed files to reproduce the results of the DictionaryLookup benchmark (Figure 3).
  4. The subdirectory tf-gnn-samples contains the needed files to reproduce the results of the VarMisuse and QM9 datasets (Table 1 and Table 4).

Requirements

Each subdirectory contains its own requirements and dependencies.

Generally, all subdirectories depend on PyTorch 1.7.1 and PyTorch Geometric version 1.7.0 (proteins depends on DGL version 0.6.0). The subdirectory tf-gnn-samples (VarMisuse and QM9) depends on TensorFlow 1.13.

Hardware

In general, all experiments can run on either GPU or CPU.

Citation

How Attentive are Graph Attention Networks?

@inproceedings{brody2022how,
  title={How Attentive are Graph Attention Networks?},
  author={Shaked Brody and Uri Alon and Eran Yahav},
  booktitle={International Conference on Learning Representations},
  year={2022},
  url={https://openreview.net/forum?id=F72ximsx7C1}
}