khurramjaved96 / Incremental Learning

PyTorch implementation of the ACCV18 paper "Revisiting Distillation and Incremental Classifier Learning."


Projects that are alternatives to or similar to Incremental Learning

Automatic Image Captioning
Generating Captions for images using Deep Learning
Stars: ✭ 84 (-8.7%)
Mutual labels:  convolutional-neural-networks
Text classification
Text Classification Algorithms: A Survey
Stars: ✭ 1,276 (+1286.96%)
Mutual labels:  convolutional-neural-networks
Fast Autoaugment
Official Implementation of 'Fast AutoAugment' in PyTorch.
Stars: ✭ 1,297 (+1309.78%)
Mutual labels:  convolutional-neural-networks
Cnn Graph Classification
A convolutional neural network for graph classification in PyTorch
Stars: ✭ 84 (-8.7%)
Mutual labels:  convolutional-neural-networks
Reinforcement Learning For Self Driving Cars
Project on designing and implementing a neural network that maximises the driving speed of a self-driving car through reinforcement learning.
Stars: ✭ 85 (-7.61%)
Mutual labels:  convolutional-neural-networks
Malware Classification
Towards Building an Intelligent Anti-Malware System: A Deep Learning Approach using Support Vector Machine for Malware Classification
Stars: ✭ 88 (-4.35%)
Mutual labels:  convolutional-neural-networks
Fashion Mnist
A MNIST-like fashion product database. Benchmark 👇
Stars: ✭ 9,675 (+10416.3%)
Mutual labels:  convolutional-neural-networks
3dunet abdomen cascade
Stars: ✭ 91 (-1.09%)
Mutual labels:  convolutional-neural-networks
Niftynet
[unmaintained] An open-source convolutional neural networks platform for research in medical image analysis and image-guided therapy
Stars: ✭ 1,276 (+1286.96%)
Mutual labels:  convolutional-neural-networks
Kerasr
R interface to the keras library
Stars: ✭ 90 (-2.17%)
Mutual labels:  convolutional-neural-networks
Tf Mobilenet V2
Mobilenet V2(Inverted Residual) Implementation & Trained Weights Using Tensorflow
Stars: ✭ 85 (-7.61%)
Mutual labels:  convolutional-neural-networks
Breast Cancer Classification
Breast Cancer Classification using CNN and transfer learning
Stars: ✭ 86 (-6.52%)
Mutual labels:  convolutional-neural-networks
Deep Learning For Beginners
videos, lectures, blogs for Deep Learning
Stars: ✭ 89 (-3.26%)
Mutual labels:  convolutional-neural-networks
Pynq Dl
Xilinx Deep Learning IP
Stars: ✭ 84 (-8.7%)
Mutual labels:  convolutional-neural-networks
Image Quality Assessment
Convolutional Neural Networks to predict the aesthetic and technical quality of images.
Stars: ✭ 1,300 (+1313.04%)
Mutual labels:  convolutional-neural-networks
Tnn
Biologically-realistic recurrent convolutional neural networks
Stars: ✭ 83 (-9.78%)
Mutual labels:  convolutional-neural-networks
Capsnet Pytorch
My attempt at implementing CapsNet from the paper Dynamic Routing Between Capsules
Stars: ✭ 87 (-5.43%)
Mutual labels:  convolutional-neural-networks
Core50
CORe50: a new Dataset and Benchmark for Continual Learning
Stars: ✭ 91 (-1.09%)
Mutual labels:  convolutional-neural-networks
Label Reg
(This repo is no longer up-to-date. Any updates will be at https://github.com/DeepRegNet/DeepReg/) A demo of the re-factored label-driven registration code, based on "Weakly-supervised convolutional neural networks for multimodal image registration"
Stars: ✭ 91 (-1.09%)
Mutual labels:  convolutional-neural-networks
Trained Ternary Quantization
Reducing the size of convolutional neural networks
Stars: ✭ 90 (-2.17%)
Mutual labels:  convolutional-neural-networks

Revisiting Distillation and Incremental Classifier Learning

Accepted at ACCV18. A pre-print is available at: http://arxiv.org/abs/1807.02802

Citing the paper:

@inproceedings{javed2018revisiting,
  title={Revisiting distillation and incremental classifier learning},
  author={Javed, Khurram and Shafait, Faisal},
  booktitle={Asian Conference on Computer Vision},
  pages={3--17},
  year={2018},
  organization={Springer}
}

Interface to Run Experiments

usage: runExperiment.py [-h] [--batch-size N] [--lr LR]
                        [--schedule SCHEDULE [SCHEDULE ...]]
                        [--gammas GAMMAS [GAMMAS ...]] [--momentum M]
                        [--no-cuda] [--random-init] [--no-distill]
                        [--distill-only-exemplars] [--no-random]
                        [--no-herding] [--seeds SEEDS [SEEDS ...]]
                        [--log-interval N] [--model-type MODEL_TYPE]
                        [--name NAME] [--outputDir OUTPUTDIR] [--upsampling]
                        [--pp] [--distill-step] [--hs]
                        [--unstructured-size UNSTRUCTURED_SIZE]
                        [--alphas ALPHAS [ALPHAS ...]] [--decay DECAY]
                        [--alpha-increment ALPHA_INCREMENT] [--l1 L1]
                        [--step-size STEP_SIZE] [--T T]
                        [--memory-budgets MEMORY_BUDGETS [MEMORY_BUDGETS ...]]
                        [--epochs-class EPOCHS_CLASS] [--dataset DATASET]
                        [--lwf] [--no-nl] [--rand] [--adversarial]

The default configuration runs with the same parameters as used by iCaRL [2]. Simply run:

python runExperiment.py
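
For example, flags from the usage string above can be combined to customise a run (the flag names come from the usage string; the values below are purely illustrative, not recommended settings):

python runExperiment.py --dataset CIFAR100 --memory-budgets 2000 --epochs-class 70 --T 2 --seeds 23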

Dependencies

  1. PyTorch 0.3.0.post4
  2. Python 3.6
  3. torchnet (https://github.com/pytorch/tnt)
  4. tqdm (pip install tqdm)

Please see requirements.txt for a complete list.

Setting up environment

The easiest way to install the required dependencies is to use the conda package manager.

  1. Install Anaconda with Python 3
  2. Install PyTorch and torchnet
  3. Install tqdm (pip install tqdm). Done.

Branches

  1. iCaRL + Dynamic Threshold Moving is implemented in "Autoencoders" branch.
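
To use it, check out that branch:

git checkout Autoencoders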


Selected Results

Removing Bias by Dynamic Threshold Moving

(Figure) Result of threshold moving with T = 2 and 5. Note that a different scale is used for the y-axis, and that using a higher temperature in general results in less bias.
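
The mechanism can be illustrated with a minimal sketch (written against a recent PyTorch; the function name and the way the per-class scaling vector is obtained are assumptions for illustration, not this repository's code):

import torch.nn.functional as F

def threshold_moving(logits, scale, T=2.0):
    # Soften the outputs with temperature T, then rescale each class,
    # compensating classes that the biased model under-predicts.
    probs = F.softmax(logits / T, dim=1)  # shape: (batch, num_classes)
    probs = probs * scale                 # per-class scaling vector
    return probs.argmax(dim=1)            # predicted class per sample

Here scale would be greater than 1 for the classes the model is biased against, e.g. classes only seen through distillation.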

Confusion Matrix with and without Dynamic Threshold Moving

(Figure) Confusion matrices of the classifier with (right) and without (left) threshold moving at T = 2. We removed the first five classes of MNIST from the train set and only distilled the knowledge of these classes using a network trained on all classes. Without threshold moving, the model struggled on the older classes. With threshold moving, however, not only was it able to classify the unseen classes nearly perfectly, but its performance on the new classes also did not deteriorate.

FAQs

How do I implement more models?

A. Add the model in model/ModelFactory and make sure the forward method of the model satisfies the API of model/resnet32.py.
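
For example, a new model could look like the following minimal sketch (the class name, layers, and constructor arguments are hypothetical; only the shape of forward matters):

import torch.nn as nn

class MyNet(nn.Module):
    # Hypothetical model; forward must return per-class scores the same
    # way the forward of model/resnet32.py does.
    def __init__(self, num_classes=100):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(16, num_classes)

    def forward(self, x):
        x = self.features(x).view(x.size(0), -1)
        return self.fc(x)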

How do I add a new dataset?

A. Add the new dataset in DatasetFactory and specify its details in the dataHandler/dataset.py class. Make sure the new dataset implements all the variables that the existing datasets set.
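
As a rough sketch (the class and attribute names below are illustrative stand-ins, not the repository's actual ones), a new dataset would expose the same fields the existing datasets define:

import numpy as np

class MyDataset:
    # Hypothetical entry for dataHandler/dataset.py with dummy data;
    # replace the arrays with real images and labels.
    def __init__(self):
        self.classes = 10
        self.train_data = np.zeros((100, 32, 32, 3), dtype=np.uint8)
        self.train_labels = np.zeros(100, dtype=np.int64)
        self.test_data = np.zeros((20, 32, 32, 3), dtype=np.uint8)
        self.test_labels = np.zeros(20, dtype=np.int64)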

References

[1] Geoffrey Hinton, Oriol Vinyals, and Jeff Dean. Distilling the knowledge in a neural network. arXiv preprint arXiv:1503.02531, 2015.

[2] Sylvestre-Alvise Rebuffi, Alexander Kolesnikov, Georg Sperl, and Christoph H. Lampert. iCaRL: Incremental classifier and representation learning. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 2001–2010, 2017.

[3] Zhizhong Li and Derek Hoiem. Learning without forgetting. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017.
