
TLESORT / Generative_Continual_Learning

License: MIT

Programming Languages

  • Python
  • Shell

Projects that are alternatives of or similar to Generative Continual Learning

CVPR21 PASS
PyTorch implementation of our CVPR2021 (oral) paper "Prototype Augmentation and Self-Supervision for Incremental Learning"
Stars: ✭ 55 (+7.84%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
cvpr clvision challenge
CVPR 2020 Continual Learning Challenge - Submit your CL algorithm today!
Stars: ✭ 57 (+11.76%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
Continual Learning Data Former
A pytorch compatible data loader to create sequence of tasks for Continual Learning
Stars: ✭ 32 (-37.25%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
precision-recall-distributions
Assessing Generative Models via Precision and Recall (official repository)
Stars: ✭ 80 (+56.86%)
Mutual labels:  generative-adversarial-network, variational-autoencoder, generative-models
Adam-NSCL
PyTorch implementation of our Adam-NSCL algorithm from our CVPR2021 (oral) paper "Training Networks in Null Space for Continual Learning"
Stars: ✭ 34 (-33.33%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
FACIL
Framework for Analysis of Class-Incremental Learning with 12 state-of-the-art methods and 3 baselines.
Stars: ✭ 411 (+705.88%)
Mutual labels:  incremental-learning, lifelong-learning, continual-learning
tt-vae-gan
Timbre transfer with variational autoencoding and cycle-consistent adversarial networks. Able to transfer the timbre of an audio source to that of another.
Stars: ✭ 37 (-27.45%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
private-data-generation
A toolbox for differentially private data generation
Stars: ✭ 80 (+56.86%)
Mutual labels:  generative-adversarial-network, generative-models
deep-blueberry
If you've always wanted to learn about deep-learning but don't know where to start, then you might have stumbled upon the right place!
Stars: ✭ 17 (-66.67%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Deep Generative Models
Deep generative models implemented with TensorFlow 2.0: eg. Restricted Boltzmann Machine (RBM), Deep Belief Network (DBN), Deep Boltzmann Machine (DBM), Convolutional Variational Auto-Encoder (CVAE), Convolutional Generative Adversarial Network (CGAN)
Stars: ✭ 34 (-33.33%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Generative models tutorial with demo
Generative Models Tutorial with Demo: Bayesian Classifier Sampling, Variational Auto Encoder (VAE), Generative Adversial Networks (GANs), Popular GANs Architectures, Auto-Regressive Models, Important Generative Model Papers, Courses, etc..
Stars: ✭ 276 (+441.18%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Repo 2017
Python codes in Machine Learning, NLP, Deep Learning and Reinforcement Learning with Keras and Theano
Stars: ✭ 1,123 (+2101.96%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
lagvae
Lagrangian VAE
Stars: ✭ 27 (-47.06%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
vaegan
An implementation of VAEGAN (variational autoencoder + generative adversarial network).
Stars: ✭ 88 (+72.55%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Textbox
TextBox is an open-source library for building text generation system.
Stars: ✭ 257 (+403.92%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Pytorch Rl
This repository contains model-free deep reinforcement learning algorithms implemented in Pytorch
Stars: ✭ 394 (+672.55%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Focal Frequency Loss
Focal Frequency Loss for Generative Models
Stars: ✭ 141 (+176.47%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I do no longer maintain this repository.
Stars: ✭ 134 (+162.75%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Video prediction
Stochastic Adversarial Video Prediction
Stars: ✭ 247 (+384.31%)
Mutual labels:  generative-adversarial-network, variational-autoencoder
Rectorch
rectorch is a pytorch-based framework for state-of-the-art top-N recommendation
Stars: ✭ 121 (+137.25%)
Mutual labels:  generative-adversarial-network, variational-autoencoder

Generative Models from the perspective of Continual Learning

Timothée Lesort, Hugo Caselles-Dupré, Michael Garcia-Ortiz, Andrei Stoian, David Filliat; IJCNN 2019, Budapest

Abstract

Which generative model is the most suitable for Continual Learning? This paper aims at evaluating and comparing generative models on disjoint sequential image generation tasks.
We investigate how several models learn and forget, considering various strategies: rehearsal, regularization, generative replay and fine-tuning. We use two quantitative metrics to estimate the generation quality and the memory ability. We experiment with sequential tasks on three commonly used benchmarks for Continual Learning (MNIST, Fashion MNIST and CIFAR10).
We find that, among all models, the original GAN performs best and that, among Continual Learning strategies, generative replay outperforms all the others. Even though we found satisfactory combinations on MNIST and Fashion MNIST, training generative models sequentially on CIFAR10 is particularly unstable and remains a challenge.
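
For readers who want the intuition behind the generative replay strategy mentioned above: when a new task arrives, the previously trained generator produces samples that stand in for all past tasks, and a fresh generator is trained on the mix of real and replayed data. Below is a minimal, hypothetical sketch of that loop, not the code of this repository; the callables train_generator and sample are assumptions to be provided by the reader.

# Minimal sketch of generative replay (illustrative only).
def generative_replay(task_datasets, train_generator, sample):
    """task_datasets:   list of datasets, one per disjoint task
    train_generator: fn(data) -> generator trained on `data`
    sample:          fn(generator, n) -> list of n generated samples"""
    generator = None
    for t, new_data in enumerate(task_datasets):
        data = list(new_data)
        if generator is not None:
            # Replay roughly as many samples as there are real ones per past task.
            data += sample(generator, len(data) * t)
        generator = train_generator(data)  # fresh generator on real + replayed data
    return generator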

Sequence of Tasks

Example of a sequence of generative tasks and the generation capability to be reached.

Citing the Project

@inproceedings{lesort2019generative,
  title={Generative models from the perspective of continual learning},
  author={Lesort, Timoth{\'e}e and Caselles-Dupr{\'e}, Hugo and Garcia-Ortiz, Michael and Stoian, Andrei and Filliat, David},
  booktitle={2019 International Joint Conference on Neural Networks (IJCNN)},
  pages={1--8},
  year={2019},
  organization={IEEE}
}

Installation

Clone the Repository

git clone https://github.com/TLESORT/Generation_Incremental.git

Environment Set-up

Manual

pytorch 0.4
torchvision 0.2.1
imageio 2.2.0
tqdm 4.19.5
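
An optional way to check that the manual set-up is importable and to see which versions are installed (a small sketch, assuming the package names listed above):

# Optional sanity check for the manual set-up (not part of the repository).
import torch, torchvision, imageio, tqdm

for name, module in [("pytorch", torch), ("torchvision", torchvision),
                     ("imageio", imageio), ("tqdm", tqdm)]:
    print(name, getattr(module, "__version__", "unknown"))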

Conda environment

conda env create -f environment.yml
source activate py36

Docker environment

TODO

Experiments Done

Datasets

  • MNIST
  • Fashion MNIST

Generative Models

  • GAN
  • CGAN
  • WGAN
  • WGAN_GP
  • VAE
  • CVAE

Task

  • Disjoint tasks -> 10 tasks (see the sketch below)
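
With 10 disjoint tasks on MNIST or Fashion MNIST, each task contains the images of a single class. The sketch below illustrates such a split using torchvision directly; it is only an illustration, since the repository builds its task sequences with Data/main_data.py.

# Illustrative only: split MNIST into 10 disjoint tasks, one digit per task.
from torch.utils.data import Subset
from torchvision import datasets, transforms

mnist = datasets.MNIST("./data", train=True, download=True,
                       transform=transforms.ToTensor())

indices_per_class = {c: [] for c in range(10)}
for i, (_, y) in enumerate(mnist):
    indices_per_class[int(y)].append(i)

# tasks[t] contains only the images of digit t
tasks = [Subset(mnist, indices_per_class[c]) for c in range(10)]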

To Add

  • CIFAR10

Run experiments

cd Scripts
./generate_test.sh
./test_todo.sh

NB: test_todo.sh contains all the bash commands to run. Since running them all may take several days, you can pick one of them and run it manually from the main repository. Below is a manual example of the commands for training and evaluating Generative Replay with a GAN on MNIST:

Generate Data

cd ./Data
# For the expert
python main_data.py --task disjoint --dataset mnist --n_tasks 1 --dir ../Archives
# For the models to train
python main_data.py --task disjoint --dataset mnist --n_tasks 10 --dir ../Archives
# For Upperbound and FID
python main_data.py --task disjoint --upperbound True --dataset mnist --n_tasks 10 --dir ../Archives

# Go back to main repo
cd ..

Train Expert to Compute FID Later

python main.py --context Classification --task_type disjoint --method Baseline --dataset mnist --epochs 50 --epoch_Review 50 --num_task 1 --seed 0 --dir ./Archives
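
The expert is a classifier trained on the real data; for MNIST-like images the paper computes FID on features from this expert rather than from an Inception network. The sketch below shows how penultimate-layer features could be collected from such a classifier; it is hypothetical, and the expert.features method name is an assumption rather than the repository's API.

# Hypothetical sketch (not the repository's code): collect penultimate-layer
# features from a trained expert so they can feed an FID computation.
import torch

def extract_features(expert, loader, device="cpu"):
    """expert: a torch.nn.Module exposing expert.features(x) (hypothetical);
    loader yields (images, labels) batches."""
    expert.eval()
    feats = []
    with torch.no_grad():
        for x, _ in loader:
            feats.append(expert.features(x.to(device)).cpu())
    return torch.cat(feats).numpy()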

Train Generator

python main.py --context Generation --task_type disjoint --method Generative_Replay --dataset mnist --epochs 50 --num_task 10 --gan_type GAN --train_G True --seed 0 --dir ./Archives

Review Generator with Fitting Capacity

python main.py --context Generation --task_type disjoint --method Generative_Replay --dataset mnist --epochs 50 --num_task 10 --gan_type GAN --Fitting_capacity True --seed 0 --dir ./Archives
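
The fitting capacity metric measures how well a classifier trained only on generated (labeled) samples performs on the real test set. A minimal, repository-independent sketch of the idea, with the three helper callables passed in as assumptions:

# Illustrative sketch of the fitting capacity metric (not the repository's code).
def fitting_capacity(generate_labeled, train_classifier, evaluate, real_test_set):
    """generate_labeled: fn() -> (samples, labels) drawn from the generator
    train_classifier: fn(samples, labels) -> classifier trained from scratch
    evaluate:         fn(classifier, dataset) -> accuracy in [0, 1]"""
    fake_x, fake_y = generate_labeled()           # training data comes only from the generator
    classifier = train_classifier(fake_x, fake_y)
    return evaluate(classifier, real_test_set)    # accuracy on real data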

Review Generator with FID

python main.py --context Generation --task_type disjoint --method Generative_Replay --dataset mnist --epochs 50 --num_task 10 --gan_type GAN --FID True --seed 0 --dir ./Archives
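
For reference, FID compares the mean and covariance of feature representations of real and generated samples; here those features would come from the expert trained above rather than from an Inception network. A small sketch of the standard computation, assuming numpy and scipy are available:

# Illustrative FID computation on two sets of feature vectors
# (shape: n_samples x feature_dim); not the repository's implementation.
import numpy as np
from scipy import linalg

def fid(real_features, fake_features):
    mu_r, mu_f = real_features.mean(axis=0), fake_features.mean(axis=0)
    sigma_r = np.cov(real_features, rowvar=False)
    sigma_f = np.cov(fake_features, rowvar=False)
    covmean, _ = linalg.sqrtm(sigma_r.dot(sigma_f), disp=False)
    if np.iscomplexobj(covmean):
        covmean = covmean.real  # drop small imaginary parts from sqrtm
    diff = mu_r - mu_f
    return diff.dot(diff) + np.trace(sigma_r + sigma_f - 2.0 * covmean)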

Print figures

Go to the main repository

Plot Fitting Capacity

python print_figures.py --fitting_capacity True

Plot FID

python print_figures.py --FID True
Figures: fitting capacity (GAN on MNIST), fitting capacity at each task (GAN on MNIST), FID (GAN on MNIST), and Fashion MNIST results at each task.

Plot Samples

Figure: generated MNIST samples.
