
alessiabertugli / FUSION

License: Apache-2.0
PyTorch code for NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples"

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to FUSION

Transferlearning
Transfer learning / domain adaptation / domain generalization / multi-task learning etc. Papers, codes, datasets, applications, tutorials. - Transfer learning
Stars: ✭ 8,481 (+47016.67%)
Mutual labels:  representation-learning, unsupervised-learning, meta-learning, few-shot-learning
Variational Ladder Autoencoder
Implementation of VLAE
Stars: ✭ 196 (+988.89%)
Mutual labels:  representation-learning, unsupervised-learning
Simclr
SimCLRv2 - Big Self-Supervised Models are Strong Semi-Supervised Learners
Stars: ✭ 2,720 (+15011.11%)
Mutual labels:  representation-learning, unsupervised-learning
Generative Continual Learning
No description or website provided.
Stars: ✭ 51 (+183.33%)
Mutual labels:  incremental-learning, continual-learning
Pointglr
Global-Local Bidirectional Reasoning for Unsupervised Representation Learning of 3D Point Clouds (CVPR 2020)
Stars: ✭ 86 (+377.78%)
Mutual labels:  representation-learning, unsupervised-learning
Autoregressive Predictive Coding
Autoregressive Predictive Coding: An unsupervised autoregressive model for speech representation learning
Stars: ✭ 138 (+666.67%)
Mutual labels:  representation-learning, unsupervised-learning
Contrastive Predictive Coding Pytorch
Contrastive Predictive Coding for Automatic Speaker Verification
Stars: ✭ 223 (+1138.89%)
Mutual labels:  representation-learning, unsupervised-learning
Unsupervised Classification
SCAN: Learning to Classify Images without Labels (ECCV 2020), incl. SimCLR.
Stars: ✭ 605 (+3261.11%)
Mutual labels:  representation-learning, unsupervised-learning
LibFewShot
LibFewShot: A Comprehensive Library for Few-shot Learning.
Stars: ✭ 629 (+3394.44%)
Mutual labels:  meta-learning, few-shot-learning
Awesome-Few-shot
Awesome Few-shot learning
Stars: ✭ 50 (+177.78%)
Mutual labels:  meta-learning, few-shot-learning
simple-cnaps
Source codes for "Improved Few-Shot Visual Classification" (CVPR 2020), "Enhancing Few-Shot Image Classification with Unlabelled Examples" (WACV 2022), and "Beyond Simple Meta-Learning: Multi-Purpose Models for Multi-Domain, Active and Continual Few-Shot Learning" (Neural Networks 2022 - in submission)
Stars: ✭ 88 (+388.89%)
Mutual labels:  meta-learning, few-shot-learning
Self Supervised Learning Overview
📜 Self-Supervised Learning from Images: Up-to-date reading list.
Stars: ✭ 73 (+305.56%)
Mutual labels:  representation-learning, unsupervised-learning
Bagofconcepts
Python implementation of bag-of-concepts
Stars: ✭ 18 (+0%)
Mutual labels:  representation-learning, unsupervised-learning
VQ-APC
Vector Quantized Autoregressive Predictive Coding (VQ-APC)
Stars: ✭ 34 (+88.89%)
Mutual labels:  representation-learning, unsupervised-learning
Simclr
PyTorch implementation of SimCLR: A Simple Framework for Contrastive Learning of Visual Representations
Stars: ✭ 750 (+4066.67%)
Mutual labels:  representation-learning, unsupervised-learning
Pytorch Byol
PyTorch implementation of Bootstrap Your Own Latent: A New Approach to Self-Supervised Learning
Stars: ✭ 213 (+1083.33%)
Mutual labels:  representation-learning, unsupervised-learning
Revisiting-Contrastive-SSL
Revisiting Contrastive Methods for Unsupervised Learning of Visual Representations. [NeurIPS 2021]
Stars: ✭ 81 (+350%)
Mutual labels:  representation-learning, unsupervised-learning
Awesome Vaes
A curated list of awesome work on VAEs, disentanglement, representation learning, and generative models.
Stars: ✭ 418 (+2222.22%)
Mutual labels:  representation-learning, unsupervised-learning
Lemniscate.pytorch
Unsupervised Feature Learning via Non-parametric Instance Discrimination
Stars: ✭ 532 (+2855.56%)
Mutual labels:  representation-learning, unsupervised-learning
FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.
Stars: ✭ 411 (+2183.33%)
Mutual labels:  incremental-learning, continual-learning

Few-Shot Unsupervised Continual Learning through Meta-Examples

This repository contains the PyTorch code for the NeurIPS 2020 4th Workshop on Meta-Learning paper:

Few-Shot Unsupervised Continual Learning through Meta-Examples
Alessia Bertugli, Stefano Vincenzi, Simone Calderara, Andrea Passerini

Model architecture

FUSION-ME - overview

Scheme of FUSION-ME. The model is composed of four phases: an embedding learning network phase, an unsupervised task construction phase, a meta-continual training phase, and a meta-continual test phase.

Prerequisites

  • Python >= 3.8
  • PyTorch >= 1.5
  • CUDA 10.0

Datasets

Embeddings

You can generate the embeddings for Mini-ImageNet and SlimageNet64 with the DeepCluster code, and for Omniglot with the ACAI code, or download them from here.
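For orientation, the sketch below shows how downloaded embeddings might be loaded before being handed to the get_embeddings function. The file name, array shape, and .npy format here are illustrative assumptions, not the repository's actual layout; check dataset/utils.py for the real expectations.

```python
import os
import tempfile

import numpy as np

# Illustrative only: the real data_folder layout, file names, and array
# shapes expected by get_embeddings in dataset/utils.py may differ.
data_folder = tempfile.mkdtemp()                       # stand-in for your dataset path
path = os.path.join(data_folder, "omniglot_acai.npy")  # assumed file name

# Assumed (num_items, embedding_dim) layout for precomputed ACAI embeddings.
np.save(path, np.zeros((1623, 256), dtype=np.float32))

embeddings = np.load(path)
print(embeddings.shape)  # (1623, 256)
```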

Best models

Available soon.

Usage Example on Omniglot

  1. Download the embeddings from the link above, then set the data_folder variable in the get_embeddings function (in the dataset/utils.py file) to your dataset path;
  2. in trainers/fusion.py, set the --dataset argument to the dataset you want to train on (e.g. Omniglot or Imagenet);
  3. set the --attention argument to exploit the meta-examples, and --num_clusters to the desired number of clusters;
  4. run trainers/fusion.py.
  • Note that the unsupervised task construction is carried out by the cactus_unbalance function, defined in dataset/dataset_factory and executed in trainers/fusion.py.
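The unsupervised task construction mentioned above can be illustrated with a minimal sketch. This is not the repository's cactus_unbalance implementation: it simply clusters precomputed embeddings with a small k-means and treats each sufficiently large cluster as a pseudo-class from which support examples are drawn. The names build_tasks, n_way, and k_shot are assumptions for the illustration.

```python
import numpy as np

def kmeans(x, k, iters=20, seed=0):
    """Plain Lloyd's k-means on rows of x; returns a cluster label per row."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = x[labels == j].mean(0)
    return labels

def build_tasks(embeddings, num_clusters, n_way, k_shot, seed=0):
    """Cluster embeddings, then sample an n_way task of k_shot indices
    per pseudo-class (cluster). Illustrative, not the repository's code."""
    rng = np.random.default_rng(seed)
    labels = kmeans(embeddings, num_clusters, seed=seed)
    # Keep only clusters large enough to supply k_shot support examples.
    valid = [c for c in range(num_clusters) if (labels == c).sum() >= k_shot]
    classes = rng.choice(valid, n_way, replace=False)
    return {int(c): rng.choice(np.flatnonzero(labels == c), k_shot, replace=False)
            for c in classes}

# Toy embeddings standing in for the precomputed ones.
emb = np.random.default_rng(0).normal(size=(200, 16)).astype(np.float32)
task = build_tasks(emb, num_clusters=10, n_way=5, k_shot=3)
print(len(task))
```

In the actual code, the clustering granularity corresponds to the --num_clusters argument, and cactus_unbalance additionally handles unbalanced cluster sizes.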

Credits

Cite

If you have any questions, please contact [email protected] or [email protected], or open an issue on this repo.

If you find this repository useful for your research, please cite the following paper:

@article{Bertugli2020fusion-me,
  title={Few-Shot Unsupervised Continual Learning through Meta-Examples},
  author={Alessia Bertugli and Stefano Vincenzi and Simone Calderara and Andrea Passerini},
  journal={34th Conference on Neural Information Processing Systems (NeurIPS 2020), 4th Workshop on Meta-Learning},
  year={2020},
  volume={abs/2009.08107}
}