
Shaoli-Huang / SnapMix

Licence: other
SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data (AAAI 2021)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to SnapMix

DataAugmentationTF
Implementation of modern data augmentation techniques in TensorFlow 2.x to be used in your training pipeline.
Stars: ✭ 35 (-72.44%)
Mutual labels:  data-augmentation, mixup, cutmix
mixup
speechpro.com/
Stars: ✭ 23 (-81.89%)
Mutual labels:  data-augmentation, mixup
Tensorflow Mnist Cnn
MNIST classification using a convolutional neural network. Various techniques such as data augmentation, dropout, and batch normalization are implemented.
Stars: ✭ 182 (+43.31%)
Mutual labels:  data-augmentation
ChineseNER
All about Chinese NER (named entity recognition)
Stars: ✭ 241 (+89.76%)
Mutual labels:  data-augmentation
Muda
A library for augmenting annotated audio data
Stars: ✭ 177 (+39.37%)
Mutual labels:  data-augmentation
Face.evolve.pytorch
🔥🔥High-Performance Face Recognition Library on PaddlePaddle & PyTorch🔥🔥
Stars: ✭ 2,719 (+2040.94%)
Mutual labels:  data-augmentation
mrnet
Building an ACL tear detector to spot knee injuries from MRIs with PyTorch (MRNet)
Stars: ✭ 98 (-22.83%)
Mutual labels:  data-augmentation
proxy-synthesis
Official PyTorch implementation of "Proxy Synthesis: Learning with Synthetic Classes for Deep Metric Learning" (AAAI 2021)
Stars: ✭ 30 (-76.38%)
Mutual labels:  aaai2021
advchain
[Medical Image Analysis] Adversarial Data Augmentation with Chained Transformations (AdvChain)
Stars: ✭ 32 (-74.8%)
Mutual labels:  data-augmentation
Mixup Generator
An implementation of "mixup: Beyond Empirical Risk Minimization"
Stars: ✭ 250 (+96.85%)
Mutual labels:  data-augmentation
Solt
Streaming over lightweight data transformations
Stars: ✭ 249 (+96.06%)
Mutual labels:  data-augmentation
Syndata Generation
Code used to generate synthetic scenes and bounding box annotations for object detection. This was used to generate data used in the Cut, Paste and Learn paper
Stars: ✭ 214 (+68.5%)
Mutual labels:  data-augmentation
machine learning course
Artificial intelligence/machine learning course at UCF in Spring 2020 (Fall 2019 and Spring 2019)
Stars: ✭ 47 (-62.99%)
Mutual labels:  data-augmentation
Scaper
A library for soundscape synthesis and augmentation
Stars: ✭ 186 (+46.46%)
Mutual labels:  data-augmentation
specAugment
Tensor2tensor experiment with SpecAugment
Stars: ✭ 46 (-63.78%)
Mutual labels:  data-augmentation
Tsaug
A Python package for time series augmentation
Stars: ✭ 180 (+41.73%)
Mutual labels:  data-augmentation
Zeroth
Kaldi-based Korean ASR (한국어 음성인식) open-source project
Stars: ✭ 248 (+95.28%)
Mutual labels:  data-augmentation
deep utils
An open-source toolkit which is full of handy functions, including the most used models and utilities for deep-learning practitioners!
Stars: ✭ 73 (-42.52%)
Mutual labels:  cutmix
CAPRICEP
An extended TSP (Time Stretched Pulse). CAPRICEP substantially replaces FVN, enabling interactive and real-time measurement of linear time-invariant, non-linear time-invariant, and random, time-varying responses simultaneously.
Stars: ✭ 23 (-81.89%)
Mutual labels:  data-augmentation
Image-Rotation-and-Cropping-tensorflow
Image rotation and cropping out the black borders in TensorFlow
Stars: ✭ 14 (-88.98%)
Mutual labels:  data-augmentation

SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data (AAAI 2021)

PyTorch implementation of SnapMix | paper

Method Overview

[Figure: overview of the SnapMix method]

Cite

@inproceedings{huang2021snapmix,
    title={SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data},
    author={Huang, Shaoli and Wang, Xinchao and Tao, Dacheng},
    year={2021},
    booktitle={AAAI Conference on Artificial Intelligence},
}

Setup

Install Package Dependencies

torch
torchvision 
PyYAML
easydict
tqdm
scikit-learn
efficientnet_pytorch
pandas
opencv
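
Assuming a standard pip environment, the list above can be installed in one step (a typical command, not taken from the repository; on PyPI the opencv package is usually published as opencv-python, and no versions are pinned here):

pip install torch torchvision PyYAML easydict tqdm scikit-learn efficientnet_pytorch pandas opencv-python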

Datasets

Create a soft link to each dataset directory:

CUB dataset

ln -s /your-path-to/CUB-dataset data/cub

Car dataset

ln -s /your-path-to/Car-dataset data/car

Aircraft dataset

ln -s /your-path-to/Aircraft-dataset data/aircraft

Training

Training with ImageNet pre-trained weights

1. Baseline and Baseline+

To train a model on the CUB dataset using the ResNet-50 backbone:

python main.py # baseline

python main.py --midlevel # baseline+

To train a model on other datasets using other network backbones, specify the following arguments:

--netname: name of the network architecture (four network families are supported: ResNet, DenseNet, InceptionV3, and EfficientNet)

--dataset: dataset name

For example,

python main.py --netname resnet18 --dataset cub # using the ResNet-18 backbone on the CUB dataset

python main.py --netname efficientnet-b0 --dataset cub # using the EfficientNet-b0 backbone on the CUB dataset

python main.py --netname inceptionV3 --dataset aircraft # using the InceptionV3 backbone on the Aircraft dataset

2. Training with mixing augmentation

To apply SnapMix in training (we used the hyperparameter values prob=1.0 and beta=5 for SnapMix in most of the experiments):

python main.py --mixmethod snapmix --beta 5 --netname resnet50 --dataset cub # SnapMix on the baseline model

python main.py --mixmethod snapmix --beta 5 --netname resnet50 --dataset cub --midlevel # SnapMix on baseline+

To apply other augmentation methods (CutMix, Cutout, and MixUp are currently supported) in training:

python main.py --mixmethod cutmix --beta 3 --netname resnet50 --dataset cub # training with CutMix

python main.py --mixmethod mixup --prob 0.5 --netname resnet50 --dataset cub # training with MixUp
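
For background, the idea behind SnapMix is to weight the two mixed labels by the semantic content of the exchanged regions, estimated from class activation maps (CAMs), rather than by box area as in CutMix. Below is a minimal, illustrative sketch of that computation, not the repository's exact code: the helper names (rand_bbox, snapmix) are ours, the backbone is assumed to expose its last convolutional feature maps and final classifier weights, and for brevity one box pair is shared across the whole batch.

import numpy as np
import torch
import torch.nn.functional as F

def rand_bbox(H, W, lam):
    # CutMix-style random box covering roughly a (1 - lam) fraction of the image
    cut_rat = np.sqrt(1.0 - lam)
    cut_h, cut_w = int(H * cut_rat), int(W * cut_rat)
    cy, cx = np.random.randint(H), np.random.randint(W)
    y1, y2 = np.clip(cy - cut_h // 2, 0, H), np.clip(cy + cut_h // 2, 0, H)
    x1, x2 = np.clip(cx - cut_w // 2, 0, W), np.clip(cx + cut_w // 2, 0, W)
    return int(y1), int(y2), int(x1), int(x2)

def snapmix(images, labels, features, fc_weight, beta=5.0):
    # images: (B, 3, H, W); features: (B, C, h, w) last conv maps,
    # assumed precomputed without gradients (e.g., under torch.no_grad());
    # fc_weight: (num_classes, C) weights of the final linear classifier.
    B, _, H, W = images.shape
    perm = torch.randperm(B)

    # CAM for each image's own label, upsampled to input size and
    # normalized so each map sums to 1 (its total "semantic mass").
    cams = F.relu(torch.einsum('bchw,bc->bhw', features, fc_weight[labels]))
    cams = F.interpolate(cams.unsqueeze(1), size=(H, W), mode='bilinear',
                         align_corners=False).squeeze(1)
    cams = cams / (cams.flatten(1).sum(1).view(B, 1, 1) + 1e-8)

    # Two independently sized boxes: a hole in image A, a source patch in image B.
    ya1, ya2, xa1, xa2 = rand_bbox(H, W, np.random.beta(beta, beta))
    yb1, yb2, xb1, xb2 = rand_bbox(H, W, np.random.beta(beta, beta))

    mixed = images.clone()
    rho_a = torch.ones(B, device=images.device)
    rho_b = torch.zeros(B, device=images.device)
    if min(ya2 - ya1, xa2 - xa1, yb2 - yb1, xb2 - xb1) > 0:
        patch = images[perm][:, :, yb1:yb2, xb1:xb2]
        mixed[:, :, ya1:ya2, xa1:xa2] = F.interpolate(
            patch, size=(ya2 - ya1, xa2 - xa1), mode='bilinear', align_corners=False)
        # Label weights: CAM mass removed from A vs. CAM mass pasted in from B.
        rho_a = 1.0 - cams[:, ya1:ya2, xa1:xa2].flatten(1).sum(1)
        rho_b = cams[perm][:, yb1:yb2, xb1:xb2].flatten(1).sum(1)
    return mixed, labels, labels[perm], rho_a, rho_b

The per-sample loss is then rho_a * CE(logits, y_a) + rho_b * CE(logits, y_b), so a patch carrying little discriminative content contributes correspondingly little weight to its label.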

3. Results

ResNet architecture.

Backbone Method CUB Car Aircraft
ResNet-18 Baseline 82.35% 91.15% 87.80%
ResNet-18 Baseline + SnapMix 84.29% 93.12% 90.17%
ResNet-34 Baseline 84.98% 92.02% 89.92%
ResNet-34 Baseline + SnapMix 87.06% 93.95% 92.36%
ResNet-50 Baseline 85.49% 93.04% 91.07%
ResNet-50 Baseline + SnapMix 87.75% 94.30% 92.08%
ResNet-101 Baseline 85.62% 93.09% 91.59%
ResNet-101 Baseline + SnapMix 88.45% 94.44% 93.74%
ResNet-50 Baseline+ 87.13% 93.80% 91.68%
ResNet-50 Baseline+ + SnapMix 88.70% 95.00% 93.24%
ResNet-101 Baseline+ 87.81% 93.94% 91.85%
ResNet-101 Baseline+ + SnapMix 89.32% 94.84% 94.05%

InceptionV3 architecture.

Backbone Method CUB
InceptionV3 Baseline 82.22%
InceptionV3 Baseline + SnapMix 85.54%

DenseNet architecture.

Backbone Method CUB
DenseNet121 Baseline 84.23%
DenseNet121 Baseline + SnapMix 87.42%

Training from scratch

To train a model without using ImageNet pre-trained weights:

python main.py --mixmethod snapmix --prob 0.5 --netname resnet18 --dataset cub --pretrained 0 # resnet-18 backbone

python main.py --mixmethod snapmix --prob 0.5 --netname resnet50 --dataset cub --pretrained 0 # resnet-50 backbone

Results

Backbone Method CUB
ResNet-18 Baseline 64.98%
ResNet-18 Baseline + SnapMix 70.31%
ResNet-50 Baseline 66.92%
ResNet-50 Baseline + SnapMix 72.17%