IdanAzuri / glico-learning-small-sample

Licence: MIT


Python 3.6

Generative Latent Implicit Conditional Optimization when Learning from Small Sample

[Paper] [Poster] [Talk]

GLICO: Generative Latent Implicit Conditional Optimization when Learning from Small Sample, accepted to ICPR 2020
Idan Azuri, Daphna Weinshall

Left: Examples of synthesized images. Each row shows five new images
(the intermediate columns), generated by smooth interpolation in the
latent space between two reconstructed images (the left and right columns).

Right: Comparison of Top-1 accuracy (with standard error) on CIFAR-10 using
WideResNet-28, with varying numbers of labeled training examples per class.
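The interpolation shown on the left can be sketched in a few lines. This is an illustration of the idea only, not the repository's API: `z_a` and `z_b` stand in for the latent codes of two reconstructed images, and decoding the intermediate codes through the generator would yield the intermediate columns.

```python
import numpy as np

def interpolate_latents(z_a, z_b, n_steps=5):
    """Linearly interpolate between two latent codes.

    Returns n_steps intermediate codes (endpoints excluded),
    mirroring the five intermediate columns in the figure.
    """
    alphas = np.linspace(0.0, 1.0, n_steps + 2)[1:-1]  # drop the endpoints
    return [(1.0 - a) * z_a + a * z_b for a in alphas]

# toy 512-dim codes, matching the --dim 512 used in the scripts below
z_a = np.zeros(512)
z_b = np.ones(512)
steps = interpolate_latents(z_a, z_b, n_steps=5)
```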

If you find this repository useful in your research, please cite the following paper:

@INPROCEEDINGS{9413259,
  author    = {I. Azuri and D. Weinshall},
  booktitle = {2020 25th International Conference on Pattern Recognition (ICPR)},
  title     = {Generative Latent Implicit Conditional Optimization when Learning from Small Sample},
  year      = {2021},
  volume    = {},
  issn      = {1051-4651},
  pages     = {8584-8591},
  keywords  = {training;interpolation;generators;pattern recognition;optimization;image classification},
  doi       = {10.1109/ICPR48806.2021.9413259},
  url       = {https://doi.ieeecomputersociety.org/10.1109/ICPR48806.2021.9413259},
  publisher = {IEEE Computer Society},
  address   = {Los Alamitos, CA, USA},
  month     = {jan}
}

1. Requirements

  • torch>=1.3.0

  • torchvision>=0.4.2

  • easyargs

All commands below are run from the glico_model directory:

dir=path-to-repo/learning-from-small-sample/glico_model
cd $dir
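A quick way to check that the installed packages satisfy these pins, using only the standard library (the version parsing here is a simplified sketch that ignores pre-release suffixes):

```python
def version_tuple(v):
    """'1.3.0' -> (1, 3, 0); non-numeric characters in a part are dropped."""
    parts = []
    for p in v.split("."):
        digits = "".join(ch for ch in p if ch.isdigit())
        if digits:
            parts.append(int(digits))
    return tuple(parts)

def meets(installed, required):
    """True if installed >= required under tuple comparison."""
    return version_tuple(installed) >= version_tuple(required)

# e.g., with torch installed:
# import torch; assert meets(torch.__version__, "1.3.0")
```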

2. Datasets

The following datasets are used in the paper: CIFAR-10, CIFAR-100, and CUB.

To experiment with differently sized variants of the CUB dataset, download the modified image-list files and unzip the archive into the root directory of your CUB dataset.
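Drawing an N-shots-per-class subset, as the --shot flag in the scripts below does, can be sketched like this. The `labels` list here is a hypothetical stand-in for per-image class labels, not the repository's data loader:

```python
import random
from collections import defaultdict

def few_shot_indices(labels, shots, seed=0):
    """Pick `shots` example indices per class, reproducibly via `seed`."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    chosen = []
    for y in sorted(by_class):
        chosen.extend(rng.sample(by_class[y], shots))
    return sorted(chosen)

# toy example: 3 classes x 10 images, keep 5 labeled examples per class
labels = [c for c in range(3) for _ in range(10)]
subset = few_shot_indices(labels, shots=5, seed=0)
```

Fixing the seed matters for reproducibility: the same seed yields the same labeled subset across the train and eval runs.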

3. Multiple shots on CUB

UNLABELED=10
SEED=0

for SHOTS in 5 10 20 30; do
  echo "glico CUB samples per class: $SHOTS"

  # train
  s="train_glico.py --rn my_test --d conv --pixel --z_init rndm --resume --tr --data cub --dim 512 --epoch 202 --fewshot --shot ${SHOTS} --seed ${SEED}"
  echo $s
  python3 $s

  sleep 15

  # eval
  s="evaluation.py -d resnet50 --pretrained --keyword cub_my_test_10unsuprvised_pixel_classifier_conv_tr_fs_${SHOTS} --is_inter --augment --epoch 200 --data cub --fewshot --shot ${SHOTS} --dim 512 --seed ${SEED}"
  echo $s
  python3 $s
done

4. Multiple shots on CIFAR-100

UNLABELED=10
SEED=0


for SHOTS in 10 25 50 100; do
  echo "glico CIFAR100 samples per class: $SHOTS"

  # train
  s="train_glico.py --rn my_test_${UNLABELED}unsuprvised --fewshot --shot ${SHOTS} --d conv --pixel --z_init rndm --resume --unlabeled_shot ${UNLABELED} --epoch 202 --noise_proj --tr --seed ${SEED} --dim 512"
  echo $s
  python3 $s

  sleep 15

  # eval
  s="evaluation.py -d wideresnet --keyword cifar-100_my_test_10unsuprvised_pixel_classifier_conv_tr_fs_${SHOTS}_ce_noise_proj --is_inter --augment --epoch 200 --data cifar --pretrained --fewshot --shot ${SHOTS} --unlabeled_shot ${UNLABELED} --loss_method ce --seed ${SEED} --dim 512"
  echo $s
  python3 $s
done

5. Baseline for CIFAR-100

  • Try the different augmentation flags:
    --random_erase
    --cutout
    --autoaugment
    or none of the above for a 'clean' baseline
  • Choose the classifier architecture from the following:
    --d wideresnet
    --d resnet50
    --d resnet (resnet110)
    --d vgg (vgg19)
SHOTS=50
UNLABELED=10
SEED=0

echo "Baseline CIFAR shot: $SHOTS"
s="baseline_classification.py --epoch 200 -d wideresnet --augment --data cifar --fewshot --shot ${SHOTS} --unlabeled_shot ${UNLABELED} --seed ${SEED}"
echo $s
python3 $s
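For reference, the --cutout flag corresponds to Cutout-style augmentation: zeroing out a random square patch of the image. A minimal NumPy sketch of the idea (an illustration, not the repository's implementation; the patch is clipped at the image borders):

```python
import numpy as np

def cutout(img, size, rng):
    """Zero out a size x size patch centered at a random pixel.

    img: H x W (x C) array; the patch is clipped at the borders.
    """
    h, w = img.shape[:2]
    cy, cx = rng.integers(h), rng.integers(w)
    y0, y1 = max(cy - size // 2, 0), min(cy + size // 2, h)
    x0, x1 = max(cx - size // 2, 0), min(cx + size // 2, w)
    out = img.copy()
    out[y0:y1, x0:x1] = 0
    return out

rng = np.random.default_rng(0)
img = np.ones((32, 32, 3))  # CIFAR-sized toy image
aug = cutout(img, size=8, rng=rng)
```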
