universome / class-norm
Licence: other
Class Normalization for Continual Zero-Shot Learning


About

This repo contains the code for the Class Normalization for Continual Zero-Shot Learning paper from ICLR 2021:

  • the code to reproduce ZSL and CZSL results
  • the proposed CZSL metrics (located in src/utils/metrics.py)
  • a fast Python implementation of the AUSUC metric
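For reference, AUSUC (Area Under the Seen-Unseen accuracy Curve) sweeps a calibration offset γ that is subtracted from the seen-class scores, records seen and unseen accuracy at each γ, and integrates the resulting curve. Below is a simplified NumPy sketch of this idea; it uses plain per-sample accuracy for brevity, and the `ausuc` helper name is illustrative — the implementation in src/utils/metrics.py is the authoritative one:

```python
import numpy as np

def ausuc(logits, labels, seen_mask):
    """Area Under the Seen-Unseen accuracy Curve (simplified sketch).

    logits:    (n_samples, n_classes) scores over all classes
    labels:    (n_samples,) ground-truth class indices
    seen_mask: (n_classes,) bool array, True for seen classes
    """
    # Per-sample gap between the best seen and best unseen score:
    # these are exactly the calibration offsets where predictions flip.
    gaps = logits[:, seen_mask].max(1) - logits[:, ~seen_mask].max(1)
    gammas = np.sort(gaps)
    gammas = np.concatenate(([gammas[0] - 1.0], gammas, [gammas[-1] + 1.0]))
    sample_is_seen = seen_mask[labels]

    accs_seen, accs_unseen = [], []
    for g in gammas:
        shifted = logits.copy()
        shifted[:, seen_mask] -= g  # penalize seen classes by gamma
        correct = shifted.argmax(1) == labels
        accs_seen.append(correct[sample_is_seen].mean())
        accs_unseen.append(correct[~sample_is_seen].mean())

    # Integrate the traced (seen acc, unseen acc) curve with the trapezoid rule
    xs, ys = np.asarray(accs_seen), np.asarray(accs_unseen)
    return float(abs(np.sum((xs[1:] - xs[:-1]) * (ys[1:] + ys[:-1]) / 2.0)))
```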

[arXiv Paper] [Google Colab] [OpenReview Paper]

In this project, we explored different normalization strategies used in ZSL and proposed a new one, class normalization, that is suited to deep attribute embedders. It allowed us to outperform existing ZSL models with a simple 3-layer MLP trained in just 30 seconds. We also extended ZSL ideas to a more general setting, Continual Zero-Shot Learning (CZSL), proposed a set of metrics for it, and tested several baselines.

Class Normalization illustration
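As a simplified illustration (not the exact code from this repo), class normalization rescales each class embedding produced by the attribute embedder to a fixed norm of sqrt(d), which keeps the logit variance stable across classes. In the sketch below, `class_norm` and `zsl_logits` are illustrative names rather than the repo's API, and the real model applies this inside the MLP embedder:

```python
import numpy as np

def class_norm(class_embs):
    """Rescale each class embedding to norm sqrt(d), keeping its direction."""
    d = class_embs.shape[1]
    norms = np.linalg.norm(class_embs, axis=1, keepdims=True)
    return np.sqrt(d) * class_embs / norms

def zsl_logits(img_feats, class_embs):
    """Score image features against normalized class prototypes (hypothetical helper)."""
    return img_feats @ class_norm(class_embs).T
```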

Installation & training

Data preparation

For ZSL

For ZSL, we tested our method on the standard GBU datasets, which you can download from the original website. The easiest way to reproduce the results is to follow our Google Colab.

For CZSL

For CZSL, we tested our method on the SUN and CUB datasets. In contrast to ZSL, in CZSL we used raw images as inputs instead of features from an ImageNet-pretrained model. For CUB, please follow the instructions in the A-GEM repo. Note that the CUB images now have to be downloaded manually from here, but we used the same splits as A-GEM. Put the A-GEM splits into the CUB data folder.

For SUN, download the data from the official website, put it under data/SUN, and then follow the instructions in scripts/sun_data_preprocessing.py.

Installing the firelab dependency

You will need to install the firelab library to run the training:

pip install firelab

Running ZSL training

Please refer to this Google Colab notebook: it contains the code to reproduce our results.

Running CZSL training

To run CZSL training, run the following command, picking one method and one dataset (e.g. -c agem -d cub):

python src/run.py -c basic|agem|mas|joint -d cub|sun

Please note that by default we load all the data into memory to speed things up. This behaviour is controlled by the in_memory flag in the config.

Results

Zero-shot learning results

ZSL results

Continual Zero-Shot Learning results

CZSL results

Training speed results for ZSL

Training speed results