rootlu / L2p Gnn

Code and datasets for the AAAI-2021 paper "Learning to Pre-train Graph Neural Networks"

Programming Languages

python

Projects that are alternatives to or similar to L2p Gnn

Multitask Learning
Awesome Multitask Learning Resources
Stars: ✭ 361 (+652.08%)
Mutual labels:  meta-learning
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+1339.58%)
Mutual labels:  meta-learning
Mfe
Meta-Feature Extractor
Stars: ✭ 20 (-58.33%)
Mutual labels:  meta-learning
Meta Transfer Learning
TensorFlow and PyTorch implementation of "Meta-Transfer Learning for Few-Shot Learning" (CVPR2019)
Stars: ✭ 439 (+814.58%)
Mutual labels:  meta-learning
Cfnet
[CVPR'17] Training a correlation filter end-to-end allows lightweight 2-layer networks (600 kB) to achieve high performance at fast speeds.
Stars: ✭ 496 (+933.33%)
Mutual labels:  meta-learning
Learningtocompare fsl
PyTorch code for CVPR 2018 paper: Learning to Compare: Relation Network for Few-Shot Learning (Few-Shot Learning part)
Stars: ✭ 837 (+1643.75%)
Mutual labels:  meta-learning
e-osvos
Implementation of "Make One-Shot Video Object Segmentation Efficient Again" and the semi-supervised fine-tuning "e-OSVOS" approach (NeurIPS 2020).
Stars: ✭ 31 (-35.42%)
Mutual labels:  meta-learning
Learning To Learn By Pytorch
"Learning to learn by gradient descent by gradient descent "by PyTorch -- a simple re-implementation.
Stars: ✭ 31 (-35.42%)
Mutual labels:  meta-learning
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+12225%)
Mutual labels:  meta-learning
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, tutorials.
Stars: ✭ 8,481 (+17568.75%)
Mutual labels:  meta-learning
Reinforcement learning tutorial with demo
Reinforcement Learning Tutorial with Demo: DP (Policy and Value Iteration), Monte Carlo, TD Learning (SARSA, Q-Learning), Function Approximation, Policy Gradient, DQN, Imitation, Meta Learning, Papers, Courses, etc.
Stars: ✭ 442 (+820.83%)
Mutual labels:  meta-learning
Meta Dataset
A dataset of datasets for learning to learn from few examples
Stars: ✭ 483 (+906.25%)
Mutual labels:  meta-learning
Hcn Prototypeloss Pytorch
Hierarchical Co-occurrence Network with Prototype Loss for Few-shot Learning (PyTorch)
Stars: ✭ 17 (-64.58%)
Mutual labels:  meta-learning
Metaoptnet
Meta-Learning with Differentiable Convex Optimization (CVPR 2019 Oral)
Stars: ✭ 412 (+758.33%)
Mutual labels:  meta-learning
Mt Net
Code accompanying the ICML-2018 paper "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace"
Stars: ✭ 30 (-37.5%)
Mutual labels:  meta-learning
Matchingnetworks
This repo provides PyTorch code that replicates the results of the Matching Networks for One Shot Learning paper on the Omniglot and MiniImageNet datasets
Stars: ✭ 256 (+433.33%)
Mutual labels:  meta-learning
Few Shot
Repository for few-shot learning machine learning projects
Stars: ✭ 727 (+1414.58%)
Mutual labels:  meta-learning
Maml Tf
TensorFlow implementation of MAML
Stars: ✭ 44 (-8.33%)
Mutual labels:  meta-learning
Few Shot Text Classification
Few-shot binary text classification with Induction Networks and Word2Vec weights initialization
Stars: ✭ 32 (-33.33%)
Mutual labels:  meta-learning
Looper
A resource list for causality in statistics, data science and physics
Stars: ✭ 23 (-52.08%)
Mutual labels:  meta-learning

Learning to Pre-train Graph Neural Networks

This repository is the official implementation of the AAAI-2021 paper "Learning to Pre-train Graph Neural Networks".

Requirements

To install requirements:

pip install -r requirements.txt
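
The pinned dependencies live in requirements.txt. A minimal setup sketch, assuming a fresh conda environment (the environment name and Python version below are illustrative, not mandated by the repo):

# Hypothetical isolated environment; requirements.txt is the authoritative dependency list.
conda create -n l2p-gnn python=3.7
conda activate l2p-gnn
pip install -r requirements.txt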

Dataset

All the necessary data files can be downloaded from the following links.

For the Biology dataset, download from Google Drive or BaiduYun (extraction code: j97n), unzip it, and put the contents under data/bio/.

For PreDBLP, our new compilation of bibliographic graphs, download from Google Drive or BaiduYun (extraction code: j97n), unzip it, and move the dblp.graph file to data/dblp/unsupervised/processed/ and the dblpfinetune.graph file to data/dblp/supervised/processed/, respectively.

In addition, to avoid the "file incomplete" errors that compressed files can cause, we also upload the uncompressed DBLP dataset at BaiduYun (extraction code: j97n).
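
After downloading, the expected data layout (as implied by the steps above) is:

data/
├── bio/                                  ← unzipped Biology dataset
└── dblp/
    ├── unsupervised/processed/dblp.graph
    └── supervised/processed/dblpfinetune.graph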

Training

To pre-train L2P-GNN (e.g., on the Biology dataset with the GIN model), run this command:

python main.py --dataset DATASET  --gnn_type GNN_MODEL --model_file PRE_TRAINED_MODEL_NAME --device 1

The pre-trained models are saved into res/DATASET/.
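
For example, to pre-train on the Biology dataset with a GIN backbone (the --model_file value here is an illustrative name, not a required one):

python main.py --dataset bio --gnn_type gin --model_file l2p_gin_bio --device 1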

Evaluation

To fine-tune L2P-GNN on the Biology dataset, run:

python eval_bio.py --dataset DATASET  --gnn_type GNN_MODEL --emb_trained_model_file EMB_TRAINED_FILE --pre_trained_model_file GNN_TRAINED_FILE --pool_trained_model_file POOL_TRAINED_FILE --result_file RESULT_FILE --device 1

The results for 10 random seeds are saved into res/DATASET/finetune_seed(0-9)/.
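
Continuing the illustrative pre-training run above, and assuming the saved components follow the _emb.pth / _gnn.pth / _pool.pth naming seen under "Reproducing results in the paper" below, a fine-tuning call might look like:

python eval_bio.py --dataset bio --gnn_type gin --emb_trained_model_file l2p_gin_bio_emb.pth --pre_trained_model_file l2p_gin_bio_gnn.pth --pool_trained_model_file l2p_gin_bio_pool.pth --result_file l2p_gin_bio --device 1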

Results

To analyze the results of the downstream tasks, run:

python result_analysis.py  --dataset DATASET --times SEED_NUM

where SEED_NUM is the number of random seeds; since the seeds range from 0 to 9, it is usually set to 10.
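
For instance, to aggregate the Biology results over all 10 seeds:

python result_analysis.py --dataset bio --times 10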

Reproducing results in the paper

Our results in the paper can be reproduced by directly running:

python eval_bio.py --dataset bio --gnn_type gin --emb_trained_model_file co_adaptation_5_300_gin_50_emb.pth --pre_trained_model_file co_adaptation_5_300_gin_50_gnn.pth --pool_trained_model_file co_adaptation_5_300_gin_50_pool.pth --result_file co_adaptation_5_300_gin_50 --device 0

and

python eval_dblp.py --dataset dblp --gnn_type gin --split random --emb_trained_model_file co_adaptation_5_300_s50q30_gin_20_emb.pth --pre_trained_model_file co_adaptation_5_300_s50q30_gin_20_gnn.pth --pool_trained_model_file co_adaptation_5_300_s50q30_gin_20_pool.pth --result_file co_adaptation_5_300_s50q30_gin_20  --device 0 --dropout_ratio 0.1