
TLESORT / Continual_Learning_Data_Former

License: MIT License
A PyTorch-compatible data loader to create sequences of tasks for continual learning



Continuum: A dataloader for continual learning


Intro

This repository proposes several scripts to create sequences of tasks for continual learning. The spirit is the following: instead of managing the sequence of tasks while learning, we create the sequence of tasks first and then load the tasks one by one during training.

This makes the programming easier and the code cleaner.

Installation

git clone https://github.com/TLESORT/Continual_Learning_Data_Former
cd Continual_Learning_Data_Former
pip install .

A few possible invocations

  • Disjoint tasks
from continuum.disjoint import Disjoint

# MNIST with 10 tasks of one class each
continuum = Disjoint(path="./Data", dataset="MNIST", task_number=10, download=True, train=True)
  • Rotation tasks
from continuum.rotations import Rotations

# MNIST with 5 tasks with various rotations
continuum = Rotations(path="./Data", dataset="MNIST", tasks_number=5, download=True, train=True,
                      min_rot=0.0, max_rot=90.0)
  • Permutation tasks
from continuum.permutations import Permutations

# MNIST with 5 tasks with different pixel permutations
continuum = Permutations(path="./Data", dataset="MNIST", tasks_number=5, download=True, train=True)
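
Each constructor above takes a train flag; presumably passing train=False builds the same sequence of tasks over the test split. The snippet below is a sketch under that assumption.

from continuum.disjoint import Disjoint

# Assumption: train=False mirrors the train=True examples above but loads
# the test split of each task.
test_continuum = Disjoint(path="./Data", dataset="MNIST", task_number=10,
                          download=True, train=False)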

Usage example

from continuum.disjoint import Disjoint
from torch.utils.data import DataLoader
import os

# create the continuum dataset
continuum = Disjoint(path=".", dataset="MNIST", task_number=10, download=True, train=True)

# create a pytorch dataloader over it
train_loader = DataLoader(continuum, batch_size=64, shuffle=True, num_workers=6)

# set the task to 0, for example
continuum.set_task(0)

# iterate on task 0
for t, (data, target) in enumerate(train_loader):
    print(target)

# change the task to 2, for example
continuum.set_task(2)

# iterate on task 2
for t, (data, target) in enumerate(train_loader):
    print(target)

# we can visualize samples from the sequence of tasks
for i in range(10):
    continuum.set_task(i)

    folder = "./Samples/disjoint_10_tasks/"

    if not os.path.exists(folder):
        os.makedirs(folder)

    path_samples = os.path.join(folder, "MNIST_task_{}.png".format(i))
    continuum.visualize_sample(path_samples, number=100, shape=[28, 28, 1])
    
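Putting it together, a typical continual-learning run visits the tasks in order through a single loader. The loop below is an illustrative sketch built from the calls shown above, not code from the repository; the model update is left as a placeholder.

from torch.utils.data import DataLoader
from continuum.disjoint import Disjoint

continuum = Disjoint(path=".", dataset="MNIST", task_number=10, download=True, train=True)
train_loader = DataLoader(continuum, batch_size=64, shuffle=True, num_workers=6)

for task_id in range(10):
    continuum.set_task(task_id)  # point the loader at the current task
    for data, target in train_loader:
        pass  # replace with your model update on (data, target)
    print("finished task", task_id)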

Task sequence possibilities

  • Disjoint tasks: each task introduces new classes
  • Rotation tasks: each task proposes the same data but with a different rotation of the data points
  • Permutation tasks: each task proposes the same data but with a different permutation of the pixels
  • MNIST Fellowship task: each task is a new MNIST-like dataset (this sequence of tasks is an original contribution of this repository; a hedged invocation sketch follows this list)
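
Since the MNIST Fellowship sequence is only named in this README, the import below is a guess patterned on the other constructors; treat it as a hypothetical sketch, not the documented API.

# Hypothetical: the module path, class name, and arguments are assumptions
# inferred from the Disjoint/Rotations/Permutations constructors above.
from continuum.mnist_fellowship import MnistFellowship

continuum = MnistFellowship(path="./Data", download=True, train=True)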

An example with MNIST: 5 disjoint tasks

[Sample images for Task 0 through Task 4]

More examples can be found in the Samples folder.

Datasets

  • MNIST
  • Fashion-MNIST
  • KMNIST
  • CIFAR10
  • Core50/Core10
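
The dataset argument selects among these. Only "MNIST" appears verbatim in the examples above, so the identifier in the sketch below is an assumption about the naming scheme.

from continuum.disjoint import Disjoint

# Assumption: "fashion" is a guessed identifier for fashion-MNIST; only
# "MNIST" is confirmed by the examples in this README.
continuum = Disjoint(path="./Data", dataset="fashion", task_number=5, download=True, train=True)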

Some supplementary options are possible:

  • The number of tasks can be chosen (1, 3, 5, and 10 tasks have been tested)
  • The class order can be shuffled for disjoint tasks
  • The magnitude of rotation can be chosen for rotation MNIST, as shown in the sketch after this list
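
The rotation magnitude is controlled through the min_rot and max_rot arguments already shown above; the values below assume they are expressed in degrees, as the 0.0 to 90.0 example suggests.

from continuum.rotations import Rotations

# 5 rotation tasks spread between 0 and 180 degrees (unit assumed from the
# min_rot=0.0, max_rot=90.0 example earlier in this README)
continuum = Rotations(path="./Data", dataset="MNIST", tasks_number=5, download=True, train=True,
                      min_rot=0.0, max_rot=180.0)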

Citing the Project

@software{timothee_lesort_2020_3605202,
  author       = {Timothée LESORT},
  title        = {Continual Learning Data Former},
  month        = jan,
  year         = 2020,
  publisher    = {Zenodo},
  version      = {v1.0},
  doi          = {10.5281/zenodo.3605202},
  url          = {https://doi.org/10.5281/zenodo.3605202}
}
