
daiquocnguyen / Walk-Transformer

License: Apache-2.0
From Random Walks to Transformer for Learning Node Embeddings (ECML-PKDD 2020) (in PyTorch and TensorFlow)

Programming Languages

Python
C++
Makefile

Projects that are alternatives to or similar to Walk-Transformer

QGNN
Quaternion Graph Neural Networks (ACML 2021) (PyTorch and TensorFlow)
Stars: ✭ 31 (+19.23%)
Mutual labels:  graph-embeddings, node-classification, graph-neural-networks, node-embeddings
kaggle-champs
Code for the CHAMPS Predicting Molecular Properties Kaggle competition
Stars: ✭ 49 (+88.46%)
Mutual labels:  transformer, graph-neural-networks
R-MeN
Transformer-based Memory Networks for Knowledge Graph Embeddings (ACL 2020) (PyTorch and TensorFlow)
Stars: ✭ 74 (+184.62%)
Mutual labels:  transformer, self-attention
TitleStylist
Source code for our "TitleStylist" paper at ACL 2020
Stars: ✭ 72 (+176.92%)
Mutual labels:  transformer, pytorch-implementation
query-selector
Long-term series forecasting with Query Selector – efficient model of sparse attention
Stars: ✭ 63 (+142.31%)
Mutual labels:  transformer, self-attention
MASTER-pytorch
Code for the paper "MASTER: Multi-Aspect Non-local Network for Scene Text Recognition" (Pattern Recognition 2021)
Stars: ✭ 263 (+911.54%)
Mutual labels:  transformer, self-attention
EgoCNN
Code for "Distributed, Egocentric Representations of Graphs for Detecting Critical Structures" (ICML 2019)
Stars: ✭ 16 (-38.46%)
Mutual labels:  graph-embeddings, graph-neural-networks
well-classified-examples-are-underestimated
Code for the AAAI 2022 publication "Well-classified Examples are Underestimated in Classification with Deep Neural Networks"
Stars: ✭ 21 (-19.23%)
Mutual labels:  transformer, graph-neural-networks
CrabNet
Predict materials properties using only the composition information!
Stars: ✭ 57 (+119.23%)
Mutual labels:  transformer, self-attention
AdaSpeech
AdaSpeech: Adaptive Text to Speech for Custom Voice
Stars: ✭ 108 (+315.38%)
Mutual labels:  transformer, pytorch-implementation
seq2seq-pytorch
Sequence to Sequence Models in PyTorch
Stars: ✭ 41 (+57.69%)
Mutual labels:  transformer, self-attention
graphtrans
Representing Long-Range Context for Graph Neural Networks with Global Attention
Stars: ✭ 45 (+73.08%)
Mutual labels:  transformer, graph-neural-networks
VT-UNet
[MICCAI2022] This is an official PyTorch implementation for A Robust Volumetric Transformer for Accurate 3D Tumor Segmentation
Stars: ✭ 151 (+480.77%)
Mutual labels:  transformer, pytorch-implementation
Representation-Learning-for-Information-Extraction
Pytorch implementation of Paper by Google Research - Representation Learning for Information Extraction from Form-like Documents.
Stars: ✭ 82 (+215.38%)
Mutual labels:  transformer, pytorch-implementation
Pytorch Seq2seq
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
Stars: ✭ 3,418 (+13046.15%)
Mutual labels:  transformer, pytorch-implementation
Generative MLZSL
[TPAMI Under Submission] Generative Multi-Label Zero-Shot Learning
Stars: ✭ 37 (+42.31%)
Mutual labels:  self-attention, pytorch-implementation
Euler
A distributed graph deep learning framework.
Stars: ✭ 2,701 (+10288.46%)
Mutual labels:  random-walk, graph-neural-networks
ClusterTransformer
Topic clustering library built on Transformer embeddings and cosine similarity metrics. Compatible with all BERT-based transformers from Hugging Face.
Stars: ✭ 36 (+38.46%)
Mutual labels:  transformer, pytorch-implementation
PDN
The official PyTorch implementation of "Pathfinder Discovery Networks for Neural Message Passing" (WebConf '21)
Stars: ✭ 44 (+69.23%)
Mutual labels:  transformer, node-classification
walklets
A lightweight implementation of Walklets from "Don't Walk Skip! Online Learning of Multi-scale Network Embeddings" (ASONAM 2017).
Stars: ✭ 94 (+261.54%)
Mutual labels:  node-classification, graph-neural-networks

From Random Walks to Transformer for Learning Node Embeddings


  • This program provides the implementation of our unsupervised node embedding model SANNE, described in our paper. Its central idea is to employ a transformer self-attention network to iteratively aggregate the vector representations of nodes in sampled random walks (see the sketch after this list).
  • SANNE can also be used in an inductive setting to infer embeddings of new/unseen nodes added to a given graph.
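
For intuition, here is a minimal, hypothetical PyTorch sketch of that aggregation idea (illustrative only, not the SANNE implementation; the class name, dimensions, and defaults are assumptions):

import torch
import torch.nn as nn

# Hypothetical sketch, not the authors' code: encode node vectors along a
# sampled random walk with a Transformer self-attention encoder.
class WalkEncoder(nn.Module):
    def __init__(self, num_nodes, dim=128, num_heads=2, num_layers=2, ff_hidden=256):
        super().__init__()
        self.embed = nn.Embedding(num_nodes, dim)
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=num_heads,
                                           dim_feedforward=ff_hidden)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, walks):                 # walks: (batch, walk_length) node IDs
        x = self.embed(walks)                 # (batch, walk_length, dim)
        h = self.encoder(x.transpose(0, 1))   # self-attention over walk positions
        return h.transpose(0, 1)              # per-position node representations

walks = torch.randint(0, 1000, (64, 8))       # 64 walks of length 8 over 1,000 nodes
out = WalkEncoder(num_nodes=1000)(walks)      # -> shape (64, 8, 128)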

Usage

News

  • June 08, 2020: Released the PyTorch (1.5.0) implementation. Change to the log_uniform directory and run make to build the SampledSoftmax module, then add the log_uniform directory to your PYTHONPATH (see the example commands after this list).
  • March 25, 2020: The TensorFlow implementation was completed a year ago and is now out of date due to TensorFlow's change from 1.x to 2.x. I will release the PyTorch implementation soon.
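
For example, the build step from the June 2020 note might look like this in a POSIX shell (exporting PYTHONPATH is one common way to make the built module importable):

$ cd log_uniform
$ make
$ export PYTHONPATH=$PYTHONPATH:$(pwd)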

Requirements

  • Python 3
  • PyTorch 1.5, or
  • TensorFlow 1.6 and Tensor2Tensor 1.9
  • scikit-learn 0.21.3
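
For the PyTorch route, one possible setup command (package names are the standard PyPI ones; the exact pins are only inferred from the versions listed above):

$ pip install torch==1.5.0 scikit-learn==0.21.3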

Training

Examples for the PyTorch implementation:

$ python train_pytorch_SANNE.py --dataset cora --batch_size 64 --num_self_att_layers 2 --num_heads 2 --ff_hidden_size 256 --num_neighbors 4 --walk_length 8 --num_walks 32 --learning_rate 0.005 --model_name CORA_trans_att2_h2_nw32_lr0.005

$ python train_pytorch_SANNE_inductive.py --dataset cora --batch_size 64 --num_self_att_layers 2 --num_heads 2 --ff_hidden_size 256 --num_neighbors 4 --walk_length 8 --num_walks 32 --fold_idx 1 --learning_rate 0.005 --model_name CORA_ind_att2_h2_nw32_fold1_lr0.005

Cite

Please cite the paper whenever SANNE is used to produce published results or incorporated into other software:

@InProceedings{Nguyen2020SANNE,
	author={Dai Quoc Nguyen and Tu Dinh Nguyen and Dinh Phung},
	title={A Self-Attention Network based Node Embedding Model},
	booktitle={Proceedings of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases (ECML PKDD)},
	year={2020}
}

License

As a free open-source implementation, SANNE is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. All other warranties including, but not limited to, merchantability and fitness for purpose, whether express, implied, or arising by operation of law, course of dealing, or trade usage are hereby disclaimed. I believe that the programs compute what I claim they compute, but I do not guarantee this. The programs may be poorly and inconsistently documented and may contain undocumented components, features or modifications. I make no guarantee that these programs will be suitable for any application.

SANNE is licensed under the Apache License 2.0.
