
sahagobinda / GPM

License: MIT
Official Code Repository for "Gradient Projection Memory for Continual Learning"

Programming Languages

Python
Shell

Projects that are alternatives to or similar to GPM

cvpr clvision challenge
CVPR 2020 Continual Learning Challenge - Submit your CL algorithm today!
Stars: ✭ 57 (+14%)
Mutual labels:  continual-learning
FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.
Stars: ✭ 411 (+722%)
Mutual labels:  continual-learning
OCDVAEContinualLearning
Open-source code for our paper: Unified Probabilistic Deep Continual Learning through Generative Replay and Open Set Recognition
Stars: ✭ 56 (+12%)
Mutual labels:  continual-learning
CPG
Steven C. Y. Hung, Cheng-Hao Tu, Cheng-En Wu, Chien-Hung Chen, Yi-Ming Chan, and Chu-Song Chen, "Compacting, Picking and Growing for Unforgetting Continual Learning," Thirty-third Conference on Neural Information Processing Systems, NeurIPS 2019
Stars: ✭ 91 (+82%)
Mutual labels:  continual-learning
continual-knowledge-learning
[ICLR 2022] Towards Continual Knowledge Learning of Language Models
Stars: ✭ 77 (+54%)
Mutual labels:  continual-learning
Generative Continual Learning
No description or website provided.
Stars: ✭ 51 (+2%)
Mutual labels:  continual-learning
CVPR21 PASS
PyTorch implementation of our CVPR2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning"
Stars: ✭ 55 (+10%)
Mutual labels:  continual-learning
BLIP
Official Implementation of CVPR2021 paper: Continual Learning via Bit-Level Information Preserving
Stars: ✭ 33 (-34%)
Mutual labels:  continual-learning
php-best-practices
What I consider the best practices for web and software development.
Stars: ✭ 60 (+20%)
Mutual labels:  continual-learning
ADER
(RecSys 2020) Adaptively Distilled Exemplar Replay towards Continual Learning for Session-based Recommendation [Best Short Paper]
Stars: ✭ 28 (-44%)
Mutual labels:  continual-learning
Continual Learning Data Former
A PyTorch-compatible data loader to create sequences of tasks for continual learning
Stars: ✭ 32 (-36%)
Mutual labels:  continual-learning
class-incremental-learning
PyTorch implementation of a VAE-based generative classifier, as well as other class-incremental learning methods that do not store data (DGR, BI-R, EWC, SI, CWR, CWR+, AR1, the "labels trick", SLDA).
Stars: ✭ 30 (-40%)
Mutual labels:  continual-learning
FUSION
PyTorch code for NeurIPSW 2020 paper (4th Workshop on Meta-Learning) "Few-Shot Unsupervised Continual Learning through Meta-Examples"
Stars: ✭ 18 (-64%)
Mutual labels:  continual-learning
MetaLifelongLanguage
Repository containing code for the paper "Meta-Learning with Sparse Experience Replay for Lifelong Language Learning".
Stars: ✭ 21 (-58%)
Mutual labels:  continual-learning
reproducible-continual-learning
Continual learning baselines and strategies from popular papers, using Avalanche. We include EWC, SI, GEM, A-GEM, LwF, iCaRL, GDumb, and other strategies.
Stars: ✭ 118 (+136%)
Mutual labels:  continual-learning
course-content-dl
NMA deep learning course
Stars: ✭ 537 (+974%)
Mutual labels:  continual-learning
class-norm
Class Normalization for Continual Zero-Shot Learning
Stars: ✭ 34 (-32%)
Mutual labels:  continual-learning
CVPR2021 PLOP
Official code of CVPR 2021's PLOP: Learning without Forgetting for Continual Semantic Segmentation
Stars: ✭ 102 (+104%)
Mutual labels:  continual-learning
Remembering-for-the-Right-Reasons
Official Implementation of Remembering for the Right Reasons (ICLR 2021)
Stars: ✭ 27 (-46%)
Mutual labels:  continual-learning
Adam-NSCL
PyTorch implementation of our Adam-NSCL algorithm from our CVPR2021 (oral) paper "Training Networks in Null Space for Continual Learning"
Stars: ✭ 34 (-32%)
Mutual labels:  continual-learning

GPM

Official PyTorch implementation of "Gradient Projection Memory for Continual Learning", ICLR 2021 (Oral).

[Paper] [ICLR Presentation Video]

Abstract

The ability to learn continually without forgetting past tasks is a desired attribute of artificial learning systems. Existing approaches to enabling such learning in artificial neural networks usually rely on network growth, importance-based weight updates, or replay of old data from memory. In contrast, we propose a novel approach in which a neural network learns new tasks by taking gradient steps orthogonal to the gradient subspaces deemed important for the past tasks. We find the bases of these subspaces by analyzing network representations (activations) after learning each task with Singular Value Decomposition (SVD) in a single-shot manner, and store them in memory as the Gradient Projection Memory (GPM). With qualitative and quantitative analyses, we show that such orthogonal gradient descent induces minimal to no interference with the past tasks, thereby mitigating forgetting. We evaluate our algorithm on diverse image classification datasets with short and long sequences of tasks and report better or on-par performance compared to state-of-the-art approaches.
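
In code, the method boils down to two operations: growing the memory with new subspace bases after each task, and projecting gradients onto the orthogonal complement of that memory while training. The following PyTorch sketch is an illustrative reconstruction from the abstract, not the repository's actual code; the function names update_gpm and project_gradient, the single-linear-layer setting, and the 0.97 energy threshold are hypothetical choices made for the example.

import torch

def update_gpm(activations, basis, threshold=0.97):
    """Grow the Gradient Projection Memory after learning a task.

    activations: (features, samples) matrix of representative layer inputs
    basis:       (features, k) orthonormal columns from past tasks, or None
    threshold:   fraction of activation energy the basis must capture
    """
    A = activations
    if basis is not None:
        # Discard directions already captured by the stored basis
        A = A - basis @ (basis.T @ A)
    U, S, _ = torch.linalg.svd(A, full_matrices=False)
    # Single-shot criterion: keep the fewest left singular vectors whose
    # cumulative energy reaches the threshold
    energy = torch.cumsum(S ** 2, dim=0) / torch.sum(S ** 2)
    k = int(torch.searchsorted(energy, threshold).item()) + 1
    new_dirs = U[:, :k]
    return new_dirs if basis is None else torch.cat([basis, new_dirs], dim=1)

def project_gradient(grad_w, basis):
    """Remove the gradient components lying in the memorized subspace.

    grad_w: (out_features, in_features) gradient of a linear layer's weight
    basis:  (in_features, k) orthonormal GPM basis, or None for the first task
    """
    if basis is None:
        return grad_w
    return grad_w - (grad_w @ basis) @ basis.T

During training on a new task, project_gradient would be applied to each layer's weight gradient after backward() and before the optimizer step, so updates stay orthogonal to the stored subspaces; once the task converges, update_gpm extends the memory from a batch of that task's activations.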

Authors

Gobinda Saha, Isha Garg, Kaushik Roy

Experiments

This repository currently contains the experiments reported in the paper for the Permuted MNIST, 10-split CIFAR-100, 20-task CIFAR-100 Superclass, and 5-Datasets benchmarks. All of these experiments can be run with the following command:

source run_experiments.sh

Citation

@inproceedings{saha2021gradient,
  title={Gradient Projection Memory for Continual Learning},
  author={Gobinda Saha and Isha Garg and Kaushik Roy},
  booktitle={International Conference on Learning Representations},
  year={2021},
  url={https://openreview.net/forum?id=3AOj0RCNC2}
}