
yoonholee / MT-net

License: MIT
Code accompanying the ICML 2018 paper "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace"

Programming Languages

Python

Projects that are alternatives to or similar to MT-net

resilient-swarm-communications-with-meta-graph-convolutional-networks
Meta graph convolutional neural network-assisted resilient swarm communications
Stars: ✭ 49 (+63.33%)
Mutual labels:  meta-learning
Awesome Papers Fewshot
Collection for Few-shot Learning
Stars: ✭ 466 (+1453.33%)
Mutual labels:  meta-learning
Learningtocompare fsl
PyTorch code for CVPR 2018 paper: Learning to Compare: Relation Network for Few-Shot Learning (Few-Shot Learning part)
Stars: ✭ 837 (+2690%)
Mutual labels:  meta-learning
e-osvos
Implementation of "Make One-Shot Video Object Segmentation Efficient Again" and the semi-supervised fine-tuning "e-OSVOS" approach (NeurIPS 2020).
Stars: ✭ 31 (+3.33%)
Mutual labels:  meta-learning
Meta Transfer Learning
TensorFlow and PyTorch implementation of "Meta-Transfer Learning for Few-Shot Learning" (CVPR2019)
Stars: ✭ 439 (+1363.33%)
Mutual labels:  meta-learning
Cfnet
[CVPR'17] Training a Correlation Filter end-to-end allows lightweight networks of 2 layers (600 kB) to achieve high performance at high speed.
Stars: ✭ 496 (+1553.33%)
Mutual labels:  meta-learning
dropclass speaker
DropClass and DropAdapt - repository for the paper accepted to Speaker Odyssey 2020
Stars: ✭ 20 (-33.33%)
Mutual labels:  meta-learning
Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning, etc. Papers, code, datasets, applications, tutorials.
Stars: ✭ 8,481 (+28170%)
Mutual labels:  meta-learning
Reinforcement learning tutorial with demo
Reinforcement Learning Tutorial with Demo: DP (Policy and Value Iteration), Monte Carlo, TD Learning (SARSA, Q-Learning), Function Approximation, Policy Gradient, DQN, Imitation, Meta Learning, Papers, Courses, etc.
Stars: ✭ 442 (+1373.33%)
Mutual labels:  meta-learning
Few Shot
Repository for few-shot machine learning projects
Stars: ✭ 727 (+2323.33%)
Mutual labels:  meta-learning
Matchingnetworks
This repo provides PyTorch code that replicates the results of the Matching Networks for One Shot Learning paper on the Omniglot and MiniImageNet datasets
Stars: ✭ 256 (+753.33%)
Mutual labels:  meta-learning
Metaoptnet
Meta-Learning with Differentiable Convex Optimization (CVPR 2019 Oral)
Stars: ✭ 412 (+1273.33%)
Mutual labels:  meta-learning
Auto Sklearn
Automated Machine Learning with scikit-learn
Stars: ✭ 5,916 (+19620%)
Mutual labels:  meta-learning
Meta-SAC
Auto-tune the Entropy Temperature of Soft Actor-Critic via Metagradient - 7th ICML AutoML workshop 2020
Stars: ✭ 19 (-36.67%)
Mutual labels:  meta-learning
Hcn Prototypeloss Pytorch
Hierarchical Co-occurrence Network with Prototype Loss for Few-shot Learning (PyTorch)
Stars: ✭ 17 (-43.33%)
Mutual labels:  meta-learning
Meta-SelfLearning
Meta Self-learning for Multi-Source Domain Adaptation: A Benchmark
Stars: ✭ 157 (+423.33%)
Mutual labels:  meta-learning
Meta Dataset
A dataset of datasets for learning to learn from few examples
Stars: ✭ 483 (+1510%)
Mutual labels:  meta-learning
Mfe
Meta-Feature Extractor
Stars: ✭ 20 (-33.33%)
Mutual labels:  meta-learning
Looper
A resource list for causality in statistics, data science and physics
Stars: ✭ 23 (-23.33%)
Mutual labels:  meta-learning
Awesome Automl And Lightweight Models
A list of high-quality (newest) AutoML works and lightweight models including 1.) Neural Architecture Search, 2.) Lightweight Structures, 3.) Model Compression, Quantization and Acceleration, 4.) Hyperparameter Optimization, 5.) Automated Feature Engineering.
Stars: ✭ 691 (+2203.33%)
Mutual labels:  meta-learning

MT-net

Code accompanying the paper "Gradient-Based Meta-Learning with Learned Layerwise Metric and Subspace" (Yoonho Lee and Seungjin Choi, ICML 2018). It includes code for running the experiments in the paper: few-shot sine wave regression, and Omniglot and miniImagenet few-shot classification.

Abstract

Gradient-based meta-learning methods leverage gradient descent to learn the commonalities among various tasks. While previous such methods have been successful in meta-learning tasks, they resort to simple gradient descent during meta-testing. Our primary contribution is the MT-net, which enables the meta-learner to learn on each layer's activation space a subspace that the task-specific learner performs gradient descent on. Additionally, a task-specific learner of an MT-net performs gradient descent with respect to a meta-learned distance metric, which warps the activation space to be more sensitive to task identity. We demonstrate that the dimension of this learned subspace reflects the complexity of the task-specific learner's adaptation task, and also that our model is less sensitive to the choice of initial learning rates than previous gradient-based meta-learning methods. Our method achieves state-of-the-art or comparable performance on few-shot classification and regression tasks.
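To make the adaptation rule concrete, here is a minimal NumPy sketch of one inner-loop step for a single MT-net layer. It is illustrative only, not the repository's TensorFlow code; the names (W, T, mask, alpha, forward) are hypothetical, and the mask is shown as a fixed binary vector rather than the learned stochastic mask used in the paper.

import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 4, 3

W = rng.normal(size=(d_out, d_in))   # task-specific weights, adapted per task
T = rng.normal(size=(d_out, d_out))  # meta-learned transformation, frozen in the inner loop
mask = np.array([1.0, 0.0, 1.0])     # meta-learned mask M: which rows of W get updated

def forward(W, x):
    # A T-net/MT-net layer computes its pre-activation as T @ (W @ x).
    return T @ (W @ x)

# One task-specific gradient step on a squared loss.
x = rng.normal(size=d_in)
y = rng.normal(size=d_out)
alpha = 0.01                         # inner-loop learning rate

err = forward(W, x) - y              # dL/d(pre-activation) for L = 0.5 * ||forward(W, x) - y||^2
grad_W = np.outer(T.T @ err, x)      # dL/dW by the chain rule

# MT-net update: gradient descent restricted to the subspace selected by M.
# With mask all ones, this reduces to the plain T-net update.
W_adapted = W - alpha * mask[:, None] * grad_W

Because T is excluded from the inner loop, it acts as a meta-learned metric that warps the effective gradient on W, while the mask confines adaptation to a subspace whose dimension reflects how much task-specific adaptation each layer needs.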

Data

For the Omniglot and miniImagenet data, see the usage instructions in data/omniglot_resized/resize_images.py and data/miniImagenet/proc_images.py, respectively.
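For illustration, a typical preparation flow might look like the following; the authoritative steps (download locations, directory layout, any arguments) are the ones documented inside the two scripts, and these zero-argument invocations are only an assumption.

# After placing the downloaded Omniglot images under data/omniglot_resized/:
python data/omniglot_resized/resize_images.py
# After placing the downloaded miniImagenet images under data/miniImagenet/:
python data/miniImagenet/proc_images.py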

Usage

To run the code, see the usage instructions at the top of main.py.

For MT-nets, set use_T, use_M, and share_M to True.

For T-nets, set use_T to True and use_M to False.
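As a sketch, runs might be launched as shown below, assuming main.py exposes these switches as command-line flags; every flag other than use_T, use_M, and share_M (e.g. the dataset selector) is illustrative and should be checked against the instructions in main.py.

# Hypothetical MT-net run on the sine wave regression task:
python main.py --datasource=sinusoid --use_T=True --use_M=True --share_M=True
# Hypothetical T-net ablation of the same experiment:
python main.py --datasource=sinusoid --use_T=True --use_M=False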

Reference

If you found the provided code useful, please cite our work.

@inproceedings{lee2018gradient,
  title={Gradient-based meta-learning with learned layerwise metric and subspace},
  author={Lee, Yoonho and Choi, Seungjin},
  booktitle={International Conference on Machine Learning},
  pages={2933--2942},
  year={2018}
}

This codebase is based on the repository for MAML.
