
PetarV- / GAT

License: MIT
Graph Attention Networks (https://arxiv.org/abs/1710.10903)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to GAT

Pygat
PyTorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Stars: ✭ 1,853 (-16.87%)
Mutual labels:  neural-networks, attention-mechanism, graph-attention-networks, self-attention
Awesome Graph Classification
A collection of important graph embedding, classification and representation learning papers with implementations.
Stars: ✭ 4,309 (+93.32%)
Mutual labels:  attention-mechanism, graph-attention-networks
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (-97.44%)
Mutual labels:  attention-mechanism, self-attention
Awesome Bert Nlp
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Stars: ✭ 567 (-74.56%)
Mutual labels:  neural-networks, attention-mechanism
Deeplearning.ai Natural Language Processing Specialization
This repository contains my full work and notes on Coursera's Natural Language Processing Specialization, taught by Younes Bensouda Mourri and Łukasz Kaiser and offered by deeplearning.ai
Stars: ✭ 473 (-78.78%)
Mutual labels:  neural-networks, attention-mechanism
Attend infer repeat
A TensorFlow implementation of Attend, Infer, Repeat
Stars: ✭ 82 (-96.32%)
Mutual labels:  neural-networks, attention-mechanism
Cryptonets
CryptoNets is a demonstration of the use of neural networks over data encrypted with Homomorphic Encryption. Homomorphic Encryption allows performing operations such as addition and multiplication over data while it is encrypted; it therefore allows keeping data private while outsourcing computation (see here and here for more about Homomorphic Encryption and its applications). This project demonstrates the use of Homomorphic Encryption for outsourcing neural-network predictions. The scenario in mind is a provider that would like to offer Prediction as a Service (PaaS) where the data for which predictions are needed may be private. This may be the case in fields such as health or finance. By using CryptoNets, the user of the service can encrypt their data using Homomorphic Encryption and send only the encrypted message to the service provider. Since Homomorphic Encryption allows the provider to operate on the data while it is encrypted, the provider can make predictions using a pre-trained neural network while the data remains encrypted throughout the process, and finally send the prediction to the user, who can decrypt the results. During the process the service provider does not learn anything about the data that was used, the prediction that was made, or any intermediate result, since everything is encrypted throughout. This project uses the Simple Encrypted Arithmetic Library (SEAL) version 3.2.1 implementation of Homomorphic Encryption developed at Microsoft Research.
Stars: ✭ 152 (-93.18%)
Mutual labels:  neural-networks
Picanet Implementation
PyTorch implementation of PiCANet: Learning Pixel-wise Contextual Attention for Saliency Detection
Stars: ✭ 157 (-92.96%)
Mutual labels:  attention-mechanism
Ml Workspace
🛠 All-in-one web-based IDE specialized for machine learning and data science.
Stars: ✭ 2,337 (+4.85%)
Mutual labels:  neural-networks
Gluon Ts
Probabilistic time series modeling in Python
Stars: ✭ 2,373 (+6.46%)
Mutual labels:  neural-networks
Poetry Seq2seq
Chinese Poetry Generation
Stars: ✭ 159 (-92.87%)
Mutual labels:  attention-mechanism
Deep Cfr
Scalable Implementation of Deep CFR and Single Deep CFR
Stars: ✭ 158 (-92.91%)
Mutual labels:  neural-networks
Ensemble Pytorch
A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model
Stars: ✭ 153 (-93.14%)
Mutual labels:  neural-networks
Pan
[Params: Only 272K!!!] Efficient Image Super-Resolution Using Pixel Attention, in ECCV Workshop, 2020.
Stars: ✭ 151 (-93.23%)
Mutual labels:  attention-mechanism
Frvsr
Frame-Recurrent Video Super-Resolution (official repository)
Stars: ✭ 157 (-92.96%)
Mutual labels:  neural-networks
Hands On Machine Learning With Scikit Learn Keras And Tensorflow
Notes & exercise solutions of Part I from the book: "Hands-On ML with Scikit-Learn, Keras & TensorFlow: Concepts, Tools, and Techniques to Build Intelligent Systems" by Aurelien Geron
Stars: ✭ 151 (-93.23%)
Mutual labels:  neural-networks
Gan Mri
Code repository for Frontiers article 'Generative Adversarial Networks for Image-to-Image Translation on Multi-Contrast MR Images - A Comparison of CycleGAN and UNIT'
Stars: ✭ 159 (-92.87%)
Mutual labels:  neural-networks
Ai Blocks
A powerful and intuitive WYSIWYG interface that allows anyone to create Machine Learning models!
Stars: ✭ 1,818 (-18.44%)
Mutual labels:  neural-networks
Sinkhorn Transformer
Sinkhorn Transformer - Practical implementation of Sparse Sinkhorn Attention
Stars: ✭ 156 (-93%)
Mutual labels:  attention-mechanism
Avalanche
Avalanche: an End-to-End Library for Continual Learning.
Stars: ✭ 151 (-93.23%)
Mutual labels:  neural-networks

GAT

Graph Attention Networks (Veličković et al., ICLR 2018): https://arxiv.org/abs/1710.10903

GAT layer t-SNE + Attention coefficients on Cora

Overview

Here we provide the implementation of a Graph Attention Network (GAT) layer in TensorFlow, along with a minimal execution example (on the Cora dataset). The repository is organised as follows:

  • data/ contains the necessary dataset files for Cora;
  • models/ contains the implementation of the GAT network (gat.py);
  • pre_trained/ contains a pre-trained Cora model (achieving 84.4% accuracy on the test set);
  • utils/ contains:
    • an implementation of an attention head, along with an experimental sparse version (layers.py);
    • preprocessing subroutines (process.py);
    • preprocessing utilities for the PPI benchmark (process_ppi.py).

Finally, execute_cora.py puts all of the above together and may be used to execute a full training run on Cora.
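
To make the attention computation concrete, here is a minimal NumPy sketch of a single dense GAT attention head, following equations (1)-(3) of the paper. It is purely illustrative and does not reproduce the TensorFlow code in layers.py; the function name gat_head and the split of the attention vector into a_src/a_dst are our own notation.

import numpy as np

def gat_head(X, A, W, a_src, a_dst, alpha=0.2):
    """Illustrative dense GAT head (sketch, not the repository code).
    X: (N, F) node features; A: (N, N) binary adjacency with self-loops;
    W: (F, F') weight matrix; a_src, a_dst: (F',) halves of the attention vector a."""
    H = X @ W                                     # shared linear transform
    # e_ij = LeakyReLU(a^T [W h_i || W h_j]), decomposed into source + target terms
    e = (H @ a_src)[:, None] + (H @ a_dst)[None, :]
    e = np.where(e > 0, e, alpha * e)             # LeakyReLU
    e = np.where(A > 0, e, -1e9)                  # mask out non-neighbours
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att = att / att.sum(axis=1, keepdims=True)    # softmax over each neighbourhood
    return att @ H                                # aggregated features (pre-activation)

# Toy usage on a 4-node path graph with self-loops
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
A = np.eye(4) + np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
out = gat_head(X, A, rng.normal(size=(3, 2)), rng.normal(size=2), rng.normal(size=2))
print(out.shape)  # (4, 2)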

Sparse version

An experimental sparse version is also available, working only when the batch size is equal to 1. The sparse model may be found at models/sp_gat.py.

You may execute a full training run of the sparse model on Cora through execute_cora_sparse.py.
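
For intuition on what the sparse variant changes, the following scipy.sparse sketch computes the same attention head edge-wise over a single graph (which is why the batch size must be 1). It mirrors the idea behind models/sp_gat.py but is not that code; sparse_gat_head and its arguments are illustrative names.

import numpy as np
import scipy.sparse as sp

def sparse_gat_head(X, A, W, a_src, a_dst, alpha=0.2):
    """Illustrative edge-wise GAT head for a single graph.
    A: scipy.sparse adjacency; must include self-loops so every node has a neighbour."""
    H = X @ W
    A = sp.coo_matrix(A)
    rows, cols = A.row, A.col                     # one entry per edge
    e = (H @ a_src)[rows] + (H @ a_dst)[cols]     # attention logits on edges only
    e = np.where(e > 0, e, alpha * e)             # LeakyReLU
    e = np.exp(e - e.max())
    denom = np.zeros(X.shape[0])
    np.add.at(denom, rows, e)                     # per-node softmax denominator
    att = e / denom[rows]
    out = np.zeros((X.shape[0], W.shape[1]))
    np.add.at(out, rows, att[:, None] * H[cols])  # weighted sum over neighbourhoods
    return out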

Dependencies

The script has been tested running under Python 3.5.2, with the following packages installed (along with their dependencies):

  • numpy==1.14.1
  • scipy==1.0.0
  • networkx==2.1
  • tensorflow-gpu==1.6.0

In addition, CUDA 9.0 and cuDNN 7 have been used.
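
As a quick, illustrative sanity check of this environment, one can print the installed versions and confirm GPU visibility; tf.test.is_gpu_available() is a TensorFlow 1.x call.

import numpy, scipy, networkx
import tensorflow as tf  # tensorflow-gpu==1.6.0

print(numpy.__version__)     # expected: 1.14.1
print(scipy.__version__)     # expected: 1.0.0
print(networkx.__version__)  # expected: 2.1
print(tf.__version__)        # expected: 1.6.0
print(tf.test.is_gpu_available())  # True if CUDA 9.0 / cuDNN 7 are set up correctly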

Reference

If you make use of the GAT model in your research, please cite the following in your manuscript:

@article{velickovic2018graph,
  title={Graph Attention Networks},
  author={Veli{\v{c}}kovi{\'{c}}, Petar and Cucurull, Guillem and Casanova, Arantxa and Romero, Adriana and Li{\`{o}}, Pietro and Bengio, Yoshua},
  journal={International Conference on Learning Representations},
  year={2018},
  url={https://openreview.net/forum?id=rJXMpikCZ},
  note={accepted as poster},
}

For getting started with GATs, as well as graph representation learning in general, we highly recommend the pytorch-GAT repository by Aleksa Gordić. It ships with an inductive (PPI) example as well.

GAT is a popular method for graph representation learning, with optimised implementations available in virtually all standard GRL libraries. We recommend using one of those (depending on your favoured framework), as their implementations have been more readily battle-tested.

Shortly after release, two unofficial ports of the GAT model to other frameworks surfaced. To honour the effort of their developers as early adopters of the GAT layer, we leave pointers to them here.

License

MIT
