
lorenmt / Maxl

Licence: MIT
The implementation of "Self-Supervised Generalisation with Meta Auxiliary Learning" [NeurIPS 2019].

Programming Languages

python

Projects that are alternatives of or similar to Maxl

Learningtocompare fsl
PyTorch code for CVPR 2018 paper: Learning to Compare: Relation Network for Few-Shot Learning (Few-Shot Learning part)
Stars: ✭ 837 (+728.71%)
Mutual labels:  meta-learning
Maml Tf
Tensorflow Implementation of MAML
Stars: ✭ 44 (-56.44%)
Mutual labels:  meta-learning
Memory Efficient Maml
Memory efficient MAML using gradient checkpointing
Stars: ✭ 60 (-40.59%)
Mutual labels:  meta-learning
Looper
A resource list for causality in statistics, data science and physics
Stars: ✭ 23 (-77.23%)
Mutual labels:  meta-learning
Few Shot Text Classification
Few-shot binary text classification with Induction Networks and Word2Vec weights initialization
Stars: ✭ 32 (-68.32%)
Mutual labels:  meta-learning
Multidigitmnist
Combine multiple MNIST digits to create datasets with 100/1000 classes for few-shot learning/meta-learning
Stars: ✭ 48 (-52.48%)
Mutual labels:  meta-learning
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+584.16%)
Mutual labels:  meta-learning
R2d2
[ICLR'19] Meta-learning with differentiable closed-form solvers
Stars: ✭ 96 (-4.95%)
Mutual labels:  meta-learning
Learning To Learn By Pytorch
"Learning to learn by gradient descent by gradient descent" in PyTorch -- a simple re-implementation.
Stars: ✭ 31 (-69.31%)
Mutual labels:  meta-learning
Neural Process Family
Code for the Neural Processes website and replication of 4 papers on NPs. PyTorch implementation.
Stars: ✭ 53 (-47.52%)
Mutual labels:  meta-learning
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning etc. Papers, codes, datasets, applications, tutorials.
Stars: ✭ 8,481 (+8297.03%)
Mutual labels:  meta-learning
Mt Net
Code accompanying the ICML-2018 paper "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace"
Stars: ✭ 30 (-70.3%)
Mutual labels:  meta-learning
G Meta
Graph meta learning via local subgraphs (NeurIPS 2020)
Stars: ✭ 50 (-50.5%)
Mutual labels:  meta-learning
Hcn Prototypeloss Pytorch
Hierarchical Co-occurrence Network with Prototype Loss for Few-shot Learning (PyTorch)
Stars: ✭ 17 (-83.17%)
Mutual labels:  meta-learning
Learn2learn
A PyTorch Library for Meta-learning Research
Stars: ✭ 1,193 (+1081.19%)
Mutual labels:  meta-learning
Few Shot
Repository for few-shot learning machine learning projects
Stars: ✭ 727 (+619.8%)
Mutual labels:  meta-learning
L2p Gnn
Codes and datasets for AAAI-2021 paper "Learning to Pre-train Graph Neural Networks"
Stars: ✭ 48 (-52.48%)
Mutual labels:  meta-learning
Gnn Meta Attack
Implementation of the paper "Adversarial Attacks on Graph Neural Networks via Meta Learning".
Stars: ✭ 99 (-1.98%)
Mutual labels:  meta-learning
Pytorch Meta
A collection of extensions and data-loaders for few-shot learning & meta-learning in PyTorch
Stars: ✭ 1,239 (+1126.73%)
Mutual labels:  meta-learning
Meta Learning Bert
Meta learning with BERT as a learner
Stars: ✭ 52 (-48.51%)
Mutual labels:  meta-learning

Meta Auxiliary Learning

This repository contains the source code to support the paper Self-Supervised Generalisation with Meta Auxiliary Learning by Shikun Liu, Andrew J. Davison and Edward Johns.

Requirements

MAXL was written in Python 3.7 and PyTorch 1.0. We recommend running the code with these versions, although we believe it should also work (or can easily be revised to work) with other versions.
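
For instance, a matching environment could be set up roughly as follows (the version pins here are illustrative):

pip install torch==1.0.0 torchvision==0.2.1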

Models & Datasets

This repository includes three models, model_vgg_single.py, model_vgg_human.py and model_vgg_maxl.py, representing the baselines Single and Human and our proposed algorithm MAXL, all with a VGG-16 backbone. These three models are trained on the 4-level CIFAR-100 dataset and should easily reproduce part of the results in Figure 3.

In create_dataset.py, we define an extended version of CIFAR-100 with a 4-level label hierarchy, built on the original CIFAR100 class in torchvision.datasets (see the full table of semantic classes in Appendix A). Each batch provides train_data, the input images, and train_label, the 4-level hierarchical labels: train_label[:, k] for k = 0, 1, 2, 3 gives the 3, 10, 20 and 100-class labels respectively.

train_data, train_label = next(cifar100_train_dataset)  # train_label has shape [batch_size, 4]
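
For context, here is a minimal sketch of how such a batch might be produced with a standard PyTorch DataLoader. The dataset class name CIFAR100MTL and its constructor arguments are assumptions for illustration; check create_dataset.py for the actual API.

from torch.utils.data import DataLoader
from create_dataset import CIFAR100MTL  # assumed class name, for illustration only

# wrap the 4-level hierarchical CIFAR-100 training split in a DataLoader
cifar100_train = CIFAR100MTL(root='dataset', train=True, download=True)
cifar100_train_loader = DataLoader(cifar100_train, batch_size=100, shuffle=True)
cifar100_train_dataset = iter(cifar100_train_loader)

train_data, train_label = next(cifar100_train_dataset)
primary_label = train_label[:, 2]    # 20-class labels
auxiliary_label = train_label[:, 3]  # 100-class labels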

Training MAXL

The source code provided gives an example of training a primary task of 20 classes (train_label[:, 2]) and an auxiliary task of 100 classes (train_label[:, 3]) with the hierarchical structure psi[i] = 5. To run the code, please create a folder dataset in this directory into which the CIFAR-100 dataset will be downloaded, or redefine the dataset root path as you wish. It is straightforward to revise the code to evaluate other hierarchies, and to experiment with other datasets found in torchvision.datasets.
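
To make the training procedure concrete, below is a self-contained toy sketch of the bi-level (meta) update that MAXL performs, with small linear networks standing in for VGG-16. All names here are illustrative, the hierarchy mask derived from psi is omitted, and the full second-order implementation lives in model_vgg_maxl.py.

import torch
import torch.nn.functional as F

torch.manual_seed(0)
x = torch.randn(8, 5)              # a toy batch of 8 inputs
y_pri = torch.randint(0, 3, (8,))  # primary labels over 3 classes
lr = 0.1

# theta_1: multi-task network = shared backbone + primary and auxiliary heads
Ws = torch.randn(5, 4, requires_grad=True)  # shared backbone
Wp = torch.randn(4, 3, requires_grad=True)  # primary head (3 classes)
Wa = torch.randn(4, 6, requires_grad=True)  # auxiliary head (sum(psi) = 6 classes)
# theta_2: label-generation network
G = torch.randn(5, 6, requires_grad=True)

# theta_2 proposes soft auxiliary labels for the batch
aux_label = F.softmax(x @ G, dim=1)

# inner step: differentiable update of theta_1 on the primary + auxiliary losses
h = torch.tanh(x @ Ws)
inner_loss = F.cross_entropy(h @ Wp, y_pri) \
           - (aux_label * F.log_softmax(h @ Wa, dim=1)).sum(1).mean()
grads = torch.autograd.grad(inner_loss, (Ws, Wp, Wa), create_graph=True)
Ws2, Wp2, Wa2 = (w - lr * g for w, g in zip((Ws, Wp, Wa), grads))

# meta step: the primary loss of the *updated* theta_1 trains theta_2, so the
# gradient flows back through the inner update (a second-order derivative)
meta_loss = F.cross_entropy(torch.tanh(x @ Ws2) @ Wp2, y_pri)
g_G, = torch.autograd.grad(meta_loss, G)
with torch.no_grad():
    G -= lr * g_G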

Note: make sure that len(psi) equals the number of primary classes and sum(psi) equals the total number of auxiliary classes. For example, psi = [2, 3, 4] represents 3 primary classes and 9 auxiliary classes in total, splitting the three primary classes into 2, 3 and 4 auxiliary classes respectively.
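
As a concrete illustration of this convention, the snippet below builds the kind of hierarchy mask implied by psi, under which each sample may only be assigned to auxiliary classes belonging to its own primary class (the names here are illustrative, not the repository's own):

import torch

psi = [2, 3, 4]  # 3 primary classes -> 2 + 3 + 4 = 9 auxiliary classes

# mask[i, j] = 1 iff auxiliary class j belongs to primary class i
mask = torch.zeros(len(psi), sum(psi))
start = 0
for i, n in enumerate(psi):
    mask[i, start:start + n] = 1.0
    start += n

print(mask)
# tensor([[1., 1., 0., 0., 0., 0., 0., 0., 0.],
#         [0., 0., 1., 1., 1., 0., 0., 0., 0.],
#         [0., 0., 0., 0., 0., 1., 1., 1., 1.]])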

Training MAXL from scratch typically requires around 30 hours on a GTX 1080, while training the baseline methods Single and Human from scratch requires 2-4 hours.

Citation

If you find this code or work useful in your own research, please consider citing the following:

@inproceedings{liu2019maxl,
  title={Self-supervised generalisation with meta auxiliary learning},
  author={Liu, Shikun and Davison, Andrew and Johns, Edward},
  booktitle={Advances in Neural Information Processing Systems},
  pages={1677--1687},
  year={2019}
}

Contact

If you have any questions, please contact [email protected].
