
Diego999 / pyGAT

License: MIT
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to pyGAT

Gat
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
Stars: ✭ 2,229 (+20.29%)
Mutual labels:  neural-networks, attention-mechanism, graph-attention-networks, self-attention
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-96.92%)
Mutual labels:  attention-mechanism, self-attention
Deeplearning.ai Natural Language Processing Specialization
This repository contains my full work and notes on Coursera's Natural Language Processing Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai
Stars: ✭ 473 (-74.47%)
Mutual labels:  neural-networks, attention-mechanism
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (-69.4%)
Mutual labels:  neural-networks, attention-mechanism
Awesome Graph Classification
A collection of important graph embedding, classification and representation learning papers with implementations.
Stars: ✭ 4,309 (+132.54%)
Mutual labels:  attention-mechanism, graph-attention-networks
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-95.57%)
Mutual labels:  neural-networks, attention-mechanism
Torchmd
End-To-End Molecular Dynamics (MD) Engine using PyTorch
Stars: ✭ 99 (-94.66%)
Mutual labels:  neural-networks
Adcme.jl
Automatic Differentiation Library for Computational and Mathematical Engineering
Stars: ✭ 106 (-94.28%)
Mutual labels:  neural-networks
Pytorchnlpbook
Code and data accompanying Natural Language Processing with PyTorch published by O'Reilly Media https://nlproc.info
Stars: ✭ 1,390 (-24.99%)
Mutual labels:  neural-networks
Smrt
Handle class imbalance intelligently by using variational auto-encoders to generate synthetic observations of your minority class.
Stars: ✭ 102 (-94.5%)
Mutual labels:  neural-networks
Xaynet
Xaynet is an agnostic federated machine learning framework for building privacy-preserving AI applications.
Stars: ✭ 111 (-94.01%)
Mutual labels:  neural-networks
Elephas
Distributed Deep learning with Keras & Spark
Stars: ✭ 1,521 (-17.92%)
Mutual labels:  neural-networks
Fast Style Transfer
TensorFlow CNN for fast style transfer ⚡🖥🎨🖼
Stars: ✭ 10,240 (+452.62%)
Mutual labels:  neural-networks
Reformer Pytorch
Reformer, the efficient Transformer, in PyTorch
Stars: ✭ 1,644 (-11.28%)
Mutual labels:  attention-mechanism
Ylg
[CVPR 2020] Official Implementation: "Your Local GAN: Designing Two Dimensional Local Attention Mechanisms for Generative Models".
Stars: ✭ 109 (-94.12%)
Mutual labels:  attention-mechanism
Spokestack Python
Spokestack is a library that allows a user to easily incorporate a voice interface into any Python application.
Stars: ✭ 103 (-94.44%)
Mutual labels:  neural-networks
Overlappredator
[CVPR 2021, Oral] PREDATOR: Registration of 3D Point Clouds with Low Overlap.
Stars: ✭ 106 (-94.28%)
Mutual labels:  attention-mechanism
Sigmoidal ai
Tutoriais de Python, Data Science, Machine Learning e Deep Learning - Sigmoidal
Stars: ✭ 103 (-94.44%)
Mutual labels:  neural-networks
Numpy Ml
Machine learning, in numpy
Stars: ✭ 11,100 (+499.03%)
Mutual labels:  neural-networks
Stanet
official implementation of the spatial-temporal attention neural network (STANet) for remote sensing image change detection
Stars: ✭ 109 (-94.12%)
Mutual labels:  attention-mechanism

PyTorch Graph Attention Network

This is a PyTorch implementation of the Graph Attention Network (GAT) model presented by Veličković et al. (2017, https://arxiv.org/abs/1710.10903).
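
Each GAT layer projects the node features with a shared weight matrix and then computes softmax-normalized attention coefficients over every node's neighbourhood. The snippet below is a minimal sketch of one dense attention head following the paper's formulation; it is illustrative rather than the repo's exact code, and it assumes the adjacency matrix already includes self-loops.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphAttentionLayer(nn.Module):
    """One dense GAT attention head (illustrative sketch, not the repo's code)."""

    def __init__(self, in_features, out_features, alpha=0.2):
        super().__init__()
        self.out_features = out_features
        self.W = nn.Parameter(torch.empty(in_features, out_features))
        self.a = nn.Parameter(torch.empty(2 * out_features, 1))
        nn.init.xavier_uniform_(self.W)
        nn.init.xavier_uniform_(self.a)
        self.leakyrelu = nn.LeakyReLU(alpha)  # the paper uses a 0.2 slope

    def forward(self, h, adj):
        # h: (N, in_features) node features; adj: (N, N) adjacency with self-loops
        Wh = h @ self.W                                      # (N, F') projections
        # e_ij = LeakyReLU(a^T [Wh_i || Wh_j]), computed by splitting `a` into the
        # halves acting on the source and target features and broadcasting.
        e = self.leakyrelu(Wh @ self.a[:self.out_features]
                           + (Wh @ self.a[self.out_features:]).T)  # (N, N) logits
        e = e.masked_fill(adj == 0, float('-inf'))           # attend only to edges
        attention = F.softmax(e, dim=1)                      # alpha_ij coefficients
        return attention @ Wh                                # aggregated features

# Example: 4 nodes, 3 input features, one 8-dimensional head
adj = torch.eye(4)
adj[0, 1] = adj[1, 0] = 1.0
out = GraphAttentionLayer(3, 8)(torch.randn(4, 3), adj)  # shape (4, 8)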

The repo was initially forked from https://github.com/tkipf/pygcn. The official (TensorFlow) repository for GAT is available at https://github.com/PetarV-/GAT. If you make use of the pyGAT model in your research, please cite the following:

@article{
  velickovic2018graph,
  title="{Graph Attention Networks}",
  author={Veli{\v{c}}kovi{\'{c}}, Petar and Cucurull, Guillem and Casanova, Arantxa and Romero, Adriana and Li{\`{o}}, Pietro and Bengio, Yoshua},
  journal={International Conference on Learning Representations},
  year={2018},
  url={https://openreview.net/forum?id=rJXMpikCZ},
  note={accepted as poster},
}

The branch master contains the implementation from the paper. The branch similar_impl_tensorflow contains the implementation from the official TensorFlow repository.

Performance

On the branch master, training on the transductive Cora task takes ~0.9 sec per epoch on a Titan Xp and 10-15 minutes for the full run (~800 epochs). The final accuracy is between 84.2 and 85.3 (over 5 different runs). On the branch similar_impl_tensorflow, training takes less than 1 minute and reaches an accuracy of ~83.0.

A small note: the initial sparse matrix operations inherited from https://github.com/tkipf/pygcn have been removed. As a result, the current model takes ~7 GB of GPU memory.
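
As a back-of-envelope check on that figure (assuming the paper's Cora setting of 8 attention heads with 8 hidden features each, and a dense layer that materializes the full pairwise concatenation [Wh_i || Wh_j] over all N^2 node pairs; both are assumptions, not facts about the current code):

# Hypothetical memory estimate; hyperparameters and tensor layout are assumptions
N, heads, hidden = 2708, 8, 8              # Cora nodes; per-head hidden size
pairwise = N * N * (2 * hidden) * 4        # float32 bytes of [Wh_i || Wh_j]
print(f"{heads * pairwise / 2**30:.1f} GiB")   # ~3.5 GiB for the forward pass

With autograd roughly doubling that for saved intermediates, a ~7 GB footprint is plausible.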

Sparse version of GAT

We also provide a sparse version of GAT in PyTorch. It is numerically unstable because of the softmax function, so the weights need to be initialized carefully. To use the sparse version, add the flag --sparse. Its performance is similar to the TensorFlow implementation; an epoch takes 0.08-0.14 sec on a Titan Xp.
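
The instability comes from exponentiating unnormalized attention logits edge by edge. The sketch below shows the usual remedy, a max-subtracted softmax computed per node over its incident edges; it uses modern PyTorch ops (scatter_reduce requires PyTorch >= 1.12), so treat it as an illustration of the idea rather than what the 0.4-era --sparse branch actually does.

import torch

def edge_softmax(scores, row, num_nodes):
    """Numerically stable softmax over each node's incident edges.

    scores: (E,) raw attention logits, one per edge
    row:    (E,) index of the node each edge points to
    """
    # Subtract each node's max logit before exp() to avoid overflow
    # (the instability mentioned above).
    row_max = torch.full((num_nodes,), float('-inf'))
    row_max = row_max.scatter_reduce(0, row, scores, reduce='amax')
    exp = (scores - row_max[row]).exp()
    denom = torch.zeros(num_nodes).scatter_add_(0, row, exp)
    return exp / (denom[row] + 1e-16)

# Two edges into node 0, one into node 1
coeffs = edge_softmax(torch.tensor([10.0, 12.0, 3.0]),
                      torch.tensor([0, 0, 1]), num_nodes=2)
# -> tensor([0.1192, 0.8808, 1.0000])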

Requirements

pyGAT relies on Python 3.5 and PyTorch 0.4.1 (due to torch.sparse_coo_tensor).
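
For context, torch.sparse_coo_tensor builds a COO-format sparse tensor from index/value pairs, which is how a sparse adjacency (or edge-indexed attention) matrix can be stored and multiplied against dense features. A minimal illustration (torch.sparse.mm, used here for the sparse-dense matmul, comes from newer PyTorch releases than the 0.4.1 pinned above):

import torch

# A 3-node graph with edges (0->2), (1->0), (1->2), stored in COO format
indices = torch.tensor([[0, 1, 1],    # row indices
                        [2, 0, 2]])   # column indices
values = torch.ones(3)                # e.g. one attention coefficient per edge
adj = torch.sparse_coo_tensor(indices, values, (3, 3))

features = torch.randn(3, 4)
out = torch.sparse.mm(adj, features)  # sparse @ dense, result shape (3, 4)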

Issues/Pull Requests/Feedback

Don't hesitate to reach out with any feedback, or to create issues/pull requests.
