daiquocnguyen / Graph Transformer

License: Apache-2.0
Transformer for Graph Classification (PyTorch and TensorFlow)

Projects that are alternatives to or similar to Graph Transformer

Athena
An open-source implementation of a sequence-to-sequence based speech processing engine
Stars: ✭ 542 (+183.77%)
Mutual labels:  unsupervised-learning, transformer
Fairseq Image Captioning
Transformer-based image captioning extension for pytorch/fairseq
Stars: ✭ 180 (-5.76%)
Mutual labels:  transformer
Medical Transformer
Pytorch Code for "Medical Transformer: Gated Axial-Attention for Medical Image Segmentation"
Stars: ✭ 153 (-19.9%)
Mutual labels:  transformer
Transformers.jl
Julia Implementation of Transformer models
Stars: ✭ 173 (-9.42%)
Mutual labels:  transformer
Tensorflowprojects
Deep learning using tensorflow
Stars: ✭ 167 (-12.57%)
Mutual labels:  unsupervised-learning
Opencog
A framework for integrated Artificial Intelligence & Artificial General Intelligence (AGI)
Stars: ✭ 2,132 (+1016.23%)
Mutual labels:  unsupervised-learning
Remixautoml
R package for automation of machine learning, forecasting, feature engineering, model evaluation, model interpretation, data generation, and recommenders.
Stars: ✭ 159 (-16.75%)
Mutual labels:  unsupervised-learning
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (-1.05%)
Mutual labels:  unsupervised-learning
Transformer Clinic
Understanding the Difficulty of Training Transformers
Stars: ✭ 179 (-6.28%)
Mutual labels:  transformer
End2end Asr Pytorch
End-to-End Automatic Speech Recognition on PyTorch
Stars: ✭ 175 (-8.38%)
Mutual labels:  transformer
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+1324.08%)
Mutual labels:  unsupervised-learning
Effective transformer
Running BERT without Padding
Stars: ✭ 169 (-11.52%)
Mutual labels:  transformer
Tensorflow Ml Nlp
Natural Language Processing with TensorFlow and Machine Learning (from logistic regression to a transformer chatbot)
Stars: ✭ 176 (-7.85%)
Mutual labels:  transformer
Danmf
A sparsity aware implementation of "Deep Autoencoder-like Nonnegative Matrix Factorization for Community Detection" (CIKM 2018).
Stars: ✭ 161 (-15.71%)
Mutual labels:  unsupervised-learning
Distancegan
Pytorch implementation of "One-Sided Unsupervised Domain Mapping" NIPS 2017
Stars: ✭ 180 (-5.76%)
Mutual labels:  unsupervised-learning
Dynamics
A Compositional Object-Based Approach to Learning Physical Dynamics
Stars: ✭ 159 (-16.75%)
Mutual labels:  unsupervised-learning
Gpt 2 Tensorflow2.0
OpenAI GPT2 pre-training and sequence prediction implementation in Tensorflow 2.0
Stars: ✭ 172 (-9.95%)
Mutual labels:  transformer
Factorvae
Pytorch implementation of FactorVAE proposed in Disentangling by Factorising (http://arxiv.org/abs/1802.05983)
Stars: ✭ 176 (-7.85%)
Mutual labels:  unsupervised-learning
Sentimentanalysis
Sentiment analysis neural network trained by fine-tuning BERT, ALBERT, or DistilBERT on the Stanford Sentiment Treebank.
Stars: ✭ 186 (-2.62%)
Mutual labels:  transformer
Homlr
Supplementary material for Hands-On Machine Learning with R, an applied book covering the fundamentals of machine learning with R.
Stars: ✭ 185 (-3.14%)
Mutual labels:  unsupervised-learning

Transformer for Graph Classification

This program provides the implementation of our U2GNN model as described in our paper, in which we leverage the transformer self-attention network to construct an advanced aggregation function for learning graph representations.
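
To make this concrete, below is a minimal PyTorch sketch of the idea, assuming sampled fixed-size neighborhoods: transformer self-attention runs over each node together with its sampled neighbors, and the refined node vectors are sum-pooled into a graph embedding. Only the names num_timesteps, num_neighbors, and ff_hidden_size mirror the training flags shown below; the class and everything else here are illustrative, not the repository's actual code.

import torch
import torch.nn as nn

class TransformerAggregator(nn.Module):
    # Illustrative sketch: self-attention over a node and its sampled neighbors.
    # dim must be divisible by num_heads.
    def __init__(self, dim, num_heads=4, ff_hidden_size=1024, num_timesteps=4):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads,
                                           dim_feedforward=ff_hidden_size)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_timesteps)

    def forward(self, node_feats, neighbor_idx):
        # node_feats: (N, dim) node features
        # neighbor_idx: (N, K) LongTensor of sampled neighbor ids per node
        neighbors = node_feats[neighbor_idx]                          # (N, K, dim)
        tokens = torch.cat([node_feats.unsqueeze(1), neighbors], 1)   # (N, K+1, dim)
        refined = self.encoder(tokens.transpose(0, 1))                # (K+1, N, dim)
        return refined[0]                                             # updated node vectors

def graph_embedding(node_vecs):
    # Sum-pool the refined node vectors into a single graph representation.
    return node_vecs.sum(dim=0)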

Usage

News

  • 17-05-2020: Updated the PyTorch (1.5.0) implementation.

Requirements

  • Python 3.x
  • TensorFlow 1.14
  • Tensor2Tensor 1.13
  • NetworkX 2.3
  • scikit-learn 0.21.2

Training

U2GNN$ python train_U2GNN_Sup.py --dataset IMDBBINARY --batch_size 4 --ff_hidden_size 1024 --fold_idx 1 --num_neighbors 8 --num_sampled 512 --num_epochs 50 --num_timesteps 4 --learning_rate 0.0005 --model_name IMDBBINARY_bs4_fold1_dro05_1024_8_idx0_4_1

U2GNN$ python train_U2GNN_Sup.py --dataset PTC --batch_size 4 --ff_hidden_size 1024 --fold_idx 1 --num_neighbors 16 --num_sampled 512 --num_epochs 50 --num_timesteps 3 --learning_rate 0.0005 --model_name PTC_bs4_fold1_dro05_1024_16_idx0_3_1
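
For orientation only, a hypothetical argparse setup that would accept the flags appearing in these commands might look as follows; the flag names are taken from the commands above, while the defaults and the comments on each flag's purpose are guesses rather than the repository's actual values:

import argparse

parser = argparse.ArgumentParser(description="Train U2GNN (supervised)")
parser.add_argument("--dataset", required=True)                  # e.g. IMDBBINARY, PTC
parser.add_argument("--batch_size", type=int, default=4)
parser.add_argument("--ff_hidden_size", type=int, default=1024)  # feed-forward hidden size
parser.add_argument("--fold_idx", type=int, default=1)           # cross-validation fold index (guess)
parser.add_argument("--num_neighbors", type=int, default=8)      # sampled neighbors per node (guess)
parser.add_argument("--num_sampled", type=int, default=512)      # sampling size (guess)
parser.add_argument("--num_epochs", type=int, default=50)
parser.add_argument("--num_timesteps", type=int, default=4)      # self-attention steps (guess)
parser.add_argument("--learning_rate", type=float, default=0.0005)
parser.add_argument("--model_name", default="model")             # name for saved outputs (guess)
args = parser.parse_args()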

Cite

Please cite the paper whenever U2GNN is used to produce published results or incorporated into other software:

@article{Nguyen2019U2GNN,
	author={Dai Quoc Nguyen and Tu Dinh Nguyen and Dinh Phung},
	title={{Universal Graph Transformer Self-Attention Networks}},
	journal={arXiv preprint arXiv:1909.11855},
	year={2019}
}

License

As a free open-source implementation, U2GNN is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. All other warranties, including but not limited to merchantability and fitness for a particular purpose, whether express, implied, or arising by operation of law, course of dealing, or trade usage, are hereby disclaimed. I believe that the programs compute what I claim they compute, but I do not guarantee this. The programs may be poorly and inconsistently documented and may contain undocumented components, features, or modifications. I make no guarantee that these programs will be suitable for any application.

U2GNN is licensed under the Apache License 2.0.