
mmasana / FACIL

License: MIT
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.

Programming Languages

python
139,335 projects - #7 most used programming language
shell
77,523 projects

Projects that are alternatives of or similar to FACIL

Generative Continual Learning
No description or website provided.
Stars: ✭ 51 (-87.59%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
Adam-NSCL
PyTorch implementation of our Adam-NSCL algorithm from our CVPR2021 (oral) paper "Training Networks in Null Space for Continual Learning"
Stars: ✭ 34 (-91.73%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
Continual Learning Data Former
A PyTorch-compatible data loader to create sequences of tasks for Continual Learning
Stars: ✭ 32 (-92.21%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
CVPR21 PASS
PyTorch implementation of our CVPR2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning"
Stars: ✭ 55 (-86.62%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
cvpr clvision challenge
CVPR 2020 Continual Learning Challenge - Submit your CL algorithm today!
Stars: ✭ 57 (-86.13%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
class-norm
Class Normalization for Continual Zero-Shot Learning
Stars: ✭ 34 (-91.73%)
Mutual labels:  lifelong-learning, continual-learning
MetaLifelongLanguage
Repository containing code for the paper "Meta-Learning with Sparse Experience Replay for Lifelong Language Learning".
Stars: ✭ 21 (-94.89%)
Mutual labels:  lifelong-learning, continual-learning
reproducible-continual-learning
Continual learning baselines and strategies from popular papers, using Avalanche. We include EWC, SI, GEM, AGEM, LwF, iCaRL, GDumb, and other strategies.
Stars: ✭ 118 (-71.29%)
Mutual labels:  lifelong-learning, continual-learning
CPG
Steven C. Y. Hung, Cheng-Hao Tu, Cheng-En Wu, Chien-Hung Chen, Yi-Ming Chan, and Chu-Song Chen, "Compacting, Picking and Growing for Unforgetting Continual Learning," Thirty-third Conference on Neural Information Processing Systems, NeurIPS 2019
Stars: ✭ 91 (-77.86%)
Mutual labels:  lifelong-learning, continual-learning
Remembering-for-the-Right-Reasons
Official Implementation of Remembering for the Right Reasons (ICLR 2021)
Stars: ✭ 27 (-93.43%)
Mutual labels:  lifelong-learning, continual-learning
SIGIR2021 Conure
One Person, One Model, One World: Learning Continual User Representation without Forgetting
Stars: ✭ 23 (-94.4%)
Mutual labels:  lifelong-learning, continual-learning
FUSION
PyTorch code for NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples"
Stars: ✭ 18 (-95.62%)
Mutual labels:  incremental-learning, continual-learning
formr.org
Chain simple surveys into longer runs to build complex studies. Use R to generate pretty feedback and complex designs.
Stars: ✭ 90 (-78.1%)
Mutual labels:  reproducible-research, survey
Avalanche
Avalanche: an End-to-End Library for Continual Learning.
Stars: ✭ 151 (-63.26%)
Mutual labels:  reproducible-research
osposurvey
Open Source Program Office (OSPO) Survey
Stars: ✭ 66 (-83.94%)
Mutual labels:  survey
Liftr
🐳 Containerize R Markdown documents for continuous reproducibility
Stars: ✭ 155 (-62.29%)
Mutual labels:  reproducible-research
Pytorch Vae
A Collection of Variational Autoencoders (VAE) in PyTorch.
Stars: ✭ 2,704 (+557.91%)
Mutual labels:  reproducible-research
targets-tutorial
Short course on the targets R package
Stars: ✭ 87 (-78.83%)
Mutual labels:  reproducible-research
Containerit
Package an R workspace and all dependencies as a Docker container
Stars: ✭ 248 (-39.66%)
Mutual labels:  reproducible-research
Hash Embeddings
PyTorch implementation of Hash Embeddings (NIPS 2017). Submission to the NIPS Implementation Challenge.
Stars: ✭ 126 (-69.34%)
Mutual labels:  reproducible-research

Framework for Analysis of Class-Incremental Learning


What is FACIL • Key Features • How To Use • Approaches • Datasets • Networks • License • Cite


What is FACIL

FACIL started as code for the paper:
Class-incremental learning: survey and performance evaluation
Marc Masana, Xialei Liu, Bartlomiej Twardowski, Mikel Menta, Andrew D. Bagdanov, Joost van de Weijer
(arxiv)

It allows reproducing the results in the paper and provides a (hopefully!) helpful framework for developing new incremental learning methods and analysing existing ones. Our idea is to expand the available approaches and tools with the help of the community. To help FACIL grow, don't forget to star this GitHub repository and share it with friends and coworkers!

Key Features

We provide a framework based on class-incremental learning. However, task-incremental learning is also fully supported. By default, experiments report results on both task-aware and task-agnostic evaluation. Furthermore, if an experiment is run with a single task on a single dataset, the results are equivalent to 'common' supervised learning.

Setting                               task-ID at train time   task-ID at test time   # of tasks
class-incremental learning            yes                     no                     ≥1
task-incremental learning             yes                     yes                    ≥1
non-incremental supervised learning   yes                     yes                    1
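To make the distinction between the two evaluations concrete, the sketch below (illustrative only, not FACIL's actual evaluation code) contrasts them for a model whose output layer concatenates the class heads of all tasks seen so far:

import torch

def evaluate_batch(logits, targets, task_offsets, task_id):
    # logits:       (batch, total_classes) scores over all classes seen so far
    # targets:      (batch,) global class labels
    # task_offsets: list of (start, end) class-index ranges, one per task
    # task_id:      ground-truth task of this batch (used only when task-aware)

    # Task-agnostic (class-IL): predict over the classes of all tasks.
    pred_agnostic = logits.argmax(dim=1)

    # Task-aware (task-IL): the ground-truth task-ID restricts the
    # prediction to the classes of that task's head.
    start, end = task_offsets[task_id]
    pred_aware = logits[:, start:end].argmax(dim=1) + start

    acc_aware = (pred_aware == targets).float().mean().item()
    acc_agnostic = (pred_agnostic == targets).float().mean().item()
    return acc_aware, acc_agnostic

Task-aware evaluation is the easier setting, since the task-ID narrows the prediction to a single head; task-agnostic evaluation must additionally resolve which task a sample came from.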

Current available approaches include:

Finetuning • Freezing • Joint

LwF • iCaRL • EWC • PathInt • MAS • RWalk • EEIL • LwM • DMC • BiC • LUCIR • IL2M
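To give a flavour of what these approaches look like inside the framework, consider LwF: its core is a knowledge-distillation term that keeps the current model's outputs on previously learned classes close to those of a frozen copy of the model from before the new task. The snippet below is a generic PyTorch rendition of that loss (with the temperature T=2 used in the original LwF paper), not FACIL's own implementation:

import torch.nn.functional as F

def lwf_distillation(new_logits, old_logits, T=2.0):
    # new_logits: current model's logits on the old-task classes
    # old_logits: logits of the frozen previous model on the same classes
    log_p_new = F.log_softmax(new_logits / T, dim=1)
    p_old = F.softmax(old_logits / T, dim=1)
    # Cross-entropy between the softened distributions; the T**2 factor
    # keeps gradient magnitudes comparable across temperatures.
    return -(p_old * log_p_new).sum(dim=1).mean() * (T ** 2)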

How To Use

Clone this github repository:

git clone https://github.com/mmasana/FACIL.git
cd FACIL
Optionally, create an environment to run the code.

Using a requirements file

The library requirements of the code are detailed in requirements.txt. You can install them using pip with:

python3 -m pip install -r requirements.txt

Using a conda environment

The development environment is based on the Conda distribution. All dependencies are listed in the environment.yml file.

Create env

To create a new environment, check out the repository and type:

conda env create --file environment.yml --name FACIL

Note: set the cudatoolkit version in environment.yml to match your CUDA driver.

Environment activation/deactivation

conda activate FACIL
conda deactivate

To run the basic code:

python3 -u src/main_incremental.py

More options are explained in the src folder, including GridSearch usage, as well as more specific options for approaches, loggers, datasets and networks.
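For example, a run combining several of these options might look like the line below; the flag names are indicative only, so check src/main_incremental.py for the exact argument list:

python3 -u src/main_incremental.py --approach lwf --datasets cifar100 --network resnet32 --num-tasks 10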

Scripts

We provide scripts to reproduce the specific scenarios proposed in Class-incremental learning: survey and performance evaluation:

  • CIFAR-100 (10 tasks) with ResNet-32 without exemplars
  • CIFAR-100 (10 tasks) with ResNet-32 with fixed and growing memory
  • MORE COMING SOON...

Each script runs 10 times so that the mean and standard deviation of the results can be calculated afterwards. Check out all available scripts in the scripts folder.
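For reference, aggregating such repeated runs is straightforward; the sketch below (with made-up accuracy values, not results from the paper) computes the mean and sample standard deviation over 10 seeds:

import numpy as np

# Hypothetical final task-agnostic accuracies from 10 runs of one scenario.
accs = np.array([0.512, 0.498, 0.505, 0.521, 0.494,
                 0.508, 0.515, 0.501, 0.497, 0.510])
print(f"mean: {accs.mean():.3f}  std: {accs.std(ddof=1):.3f}")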

License

Please check the MIT license that is listed in this repository.

Cite

If you want to cite the framework, feel free to use this citation:

@article{masana2022class,
  title={Class-Incremental Learning: Survey and Performance Evaluation on Image Classification},
  author={Masana, Marc and Liu, Xialei and Twardowski, Bartlomiej and Menta, Mikel and Bagdanov, Andrew D and van de Weijer, Joost},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence},
  doi={10.1109/TPAMI.2022.3213473},
  pages={1-20},
  year={2022}
}

The basis of FACIL is made possible thanks to Marc Masana, Xialei Liu, Bartlomiej Twardowski and Mikel Menta. Code structure is inspired by HAT. Feel free to contribute or propose new features by opening an issue!

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].