class-incremental-learning: PyTorch implementation of a VAE-based generative classifier, as well as other class-incremental learning methods that do not store data (DGR, BI-R, EWC, SI, CWR, CWR+, AR1, the "labels trick", SLDA).
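Of the methods this repo lists, SLDA (streaming linear discriminant analysis) is compact enough to sketch: keep a running mean per class and one shared running covariance over feature residuals, then classify with the resulting linear discriminant. A minimal NumPy sketch, not the repo's implementation (the class name, update rule details, and shrinkage constant are illustrative assumptions):

```python
import numpy as np

class SLDA:
    """Streaming LDA sketch: per-class running means plus a shared running
    covariance, updated one sample at a time, with no stored exemplars."""

    def __init__(self, dim, n_classes, shrinkage=1e-4):
        self.mu = np.zeros((n_classes, dim))   # running class means
        self.counts = np.zeros(n_classes)      # samples seen per class
        self.sigma = np.zeros((dim, dim))      # shared running covariance
        self.n = 0                             # total samples seen
        self.shrinkage = shrinkage             # ridge term for inversion

    def fit_one(self, x, y):
        # residual of x against the current mean of its class
        delta = x - self.mu[y]
        # streaming update of the shared covariance from that residual
        self.n += 1
        self.sigma += (np.outer(delta, delta) - self.sigma) / self.n
        # streaming update of the class mean
        self.counts[y] += 1
        self.mu[y] += delta / self.counts[y]

    def predict(self, x):
        # precision matrix with shrinkage, then the usual LDA discriminant
        lam = np.linalg.inv(self.sigma + self.shrinkage * np.eye(self.mu.shape[1]))
        w = self.mu @ lam                              # class weight vectors
        b = -0.5 * np.einsum('kd,kd->k', w, self.mu)   # class bias terms
        return int(np.argmax(w @ x + b))
```

Because only the means and one covariance are updated, adding a new class never overwrites what was learned for old ones, which is why SLDA appears among the rehearsal-free baselines.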
Conure (SIGIR 2021): "One Person, One Model, One World: Learning Continual User Representation without Forgetting".
CPG: Steven C. Y. Hung, Cheng-Hao Tu, Cheng-En Wu, Chien-Hung Chen, Yi-Ming Chan, and Chu-Song Chen, "Compacting, Picking and Growing for Unforgetting Continual Learning," Thirty-third Conference on Neural Information Processing Systems (NeurIPS 2019).
MetaLifelongLanguage: Code for the paper "Meta-Learning with Sparse Experience Replay for Lifelong Language Learning".
PASS: PyTorch implementation of the CVPR 2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning".
GPM: Official code repository for "Gradient Projection Memory for Continual Learning".
PLOP: Official code of the CVPR 2021 paper "PLOP: Learning without Forgetting for Continual Semantic Segmentation".
BLIP: Official implementation of the CVPR 2021 paper "Continual Learning via Bit-Level Information Preserving".
reproducible-continual-learning: Continual learning baselines and strategies from popular papers, implemented with Avalanche. Includes EWC, SI, GEM, A-GEM, LwF, iCaRL, GDumb, and other strategies.
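Several of the regularization strategies listed here (EWC, SI) share the same shape: a quadratic penalty that anchors important parameters near their values after the previous task. For EWC, the extra loss term is lam/2 * sum_i F_i * (theta_i - theta*_i)^2, where F is a diagonal Fisher information estimate. A minimal NumPy sketch of that penalty, under stated assumptions (function names and the gradient-based Fisher estimate are illustrative, not taken from any of these repos):

```python
import numpy as np

def diag_fisher(per_sample_grads):
    """Estimate the diagonal Fisher information as the mean of squared
    per-sample log-likelihood gradients (one row per sample)."""
    return np.mean(np.asarray(per_sample_grads) ** 2, axis=0)

def ewc_penalty(theta, theta_star, fisher, lam=1.0):
    """EWC regularizer: lam/2 * sum_i F_i * (theta_i - theta*_i)^2.

    theta      -- current (flattened) parameters
    theta_star -- parameters frozen after the previous task
    fisher     -- diagonal Fisher estimate, one value per parameter
    """
    return 0.5 * lam * np.sum(fisher * (theta - theta_star) ** 2)
```

Parameters with near-zero Fisher values are free to move for the new task, while high-Fisher parameters are pinned; SI differs mainly in how the per-parameter importance weights are accumulated during training rather than estimated after each task.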
OCDVAEContinualLearning: Open-source code for the paper "Unified Probabilistic Deep Continual Learning through Generative Replay and Open Set Recognition".
ADER (RecSys 2020, Best Short Paper): "Adaptively Distilled Exemplar Replay towards Continual Learning for Session-based Recommendation".
Adam-NSCL: PyTorch implementation of the Adam-NSCL algorithm from the CVPR 2021 (oral) paper "Training Networks in Null Space for Continual Learning".
FUSION: PyTorch code for the NeurIPS 2020 workshop paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples".
class-norm: Class Normalization for Continual Zero-Shot Learning.
FACIL: Framework for Analysis of Class-Incremental Learning, with 12 state-of-the-art methods and 3 baselines.