
yu4u / Mixup Generator

License: MIT
An implementation of "mixup: Beyond Empirical Risk Minimization"

Projects that are alternatives to or similar to Mixup Generator

All Classifiers 2019
A collection of computer vision projects for Acute Lymphoblastic Leukemia classification/early detection.
Stars: ✭ 22 (-91.2%)
Mutual labels:  jupyter-notebook, data-augmentation, deep-neural-networks
Dab
Data Augmentation by Backtranslation (DAB) ヽ( •_-)ᕗ
Stars: ✭ 294 (+17.6%)
Mutual labels:  jupyter-notebook, data-augmentation, deep-neural-networks
Solt
Streaming over lightweight data transformations
Stars: ✭ 249 (-0.4%)
Mutual labels:  jupyter-notebook, data-augmentation
Andrew Ng Notes
This is Andrew NG Coursera Handwritten Notes.
Stars: ✭ 180 (-28%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Traffic Sign Detection
Traffic Sign Detection. Code for the paper entitled "Evaluation of deep neural networks for traffic sign detection systems".
Stars: ✭ 200 (-20%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Text Emotion Classification
Archived - not answering issues
Stars: ✭ 165 (-34%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Timesynth
A Multipurpose Library for Synthetic Time Series Generation in Python
Stars: ✭ 170 (-32%)
Mutual labels:  jupyter-notebook, generator
Pytorch Geometric Yoochoose
This is a tutorial for PyTorch Geometric on the YooChoose dataset
Stars: ✭ 198 (-20.8%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Starnet
StarNet
Stars: ✭ 141 (-43.6%)
Mutual labels:  jupyter-notebook, deep-neural-networks
50 Days Of Ml
A day to day plan for this challenge (50 Days of Machine Learning) . Covers both theoretical and practical aspects
Stars: ✭ 218 (-12.8%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Learnopencv
Learn OpenCV : C++ and Python Examples
Stars: ✭ 15,385 (+6054%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Cardio
CardIO is a library for data science research of heart signals
Stars: ✭ 218 (-12.8%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Hey Jetson
Deep Learning based Automatic Speech Recognition with attention for the Nvidia Jetson.
Stars: ✭ 161 (-35.6%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Applied Deep Learning With Tensorflow
Learn applied deep learning from zero to deployment using TensorFlow 1.8+
Stars: ✭ 160 (-36%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Deep Math Machine Learning.ai
A blog which talks about machine learning, deep learning algorithms and the Math. and Machine learning algorithms written from scratch.
Stars: ✭ 173 (-30.8%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Multihead Siamese Nets
Implementation of Siamese Neural Networks built upon multihead attention mechanism for text semantic similarity task.
Stars: ✭ 144 (-42.4%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Germanwordembeddings
Toolkit to obtain and preprocess german corpora, train models using word2vec (gensim) and evaluate them with generated testsets
Stars: ✭ 189 (-24.4%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Dlwpt Code
Code for the book Deep Learning with PyTorch by Eli Stevens, Luca Antiga, and Thomas Viehmann.
Stars: ✭ 3,054 (+1121.6%)
Mutual labels:  jupyter-notebook, deep-neural-networks
Copy Paste Aug
Copy-paste augmentation for segmentation and detection tasks
Stars: ✭ 132 (-47.2%)
Mutual labels:  jupyter-notebook, data-augmentation
Glasses
High-quality Neural Networks for Computer Vision 😎
Stars: ✭ 138 (-44.8%)
Mutual labels:  jupyter-notebook, deep-neural-networks

mixup generator

This is an implementation of the mixup algorithm.

Mixup

Mixup [1] is an image augmentation method that augments training data by mixing pairs of training images and their labels via linear interpolation with weight lambda:

X = lambda * X1 + (1 - lambda) * X2,
y = lambda * y1 + (1 - lambda) * y2,

where lambda is drawn from the beta distribution Be(alpha, alpha), and alpha is a hyperparameter. Smaller values of alpha concentrate lambda near 0 and 1, so mixed samples stay close to one of the originals; alpha = 1 makes lambda uniform on [0, 1].

Please check mixup_generator.py for implementation details.
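
For intuition, the mixing step alone can be sketched in a few lines of NumPy (a simplified illustration; mixup_generator.py additionally handles shuffling, batching, and the optional ImageDataGenerator integration described below):

import numpy as np

def mixup_batch(x1, y1, x2, y2, alpha=0.2):
    # draw one lambda per sample from Be(alpha, alpha)
    batch_size = x1.shape[0]
    lam = np.random.beta(alpha, alpha, batch_size)
    x_lam = lam.reshape(batch_size, 1, 1, 1)  # broadcast over h, w, c
    y_lam = lam.reshape(batch_size, 1)        # broadcast over classes
    x = x_lam * x1 + (1 - x_lam) * x2
    y = y_lam * y1 + (1 - y_lam) * y2
    return x, y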

Usage

Get MixupGenerator:

from mixup_generator import MixupGenerator
training_generator = MixupGenerator(x_train, y_train, batch_size=batch_size, alpha=0.2)()
  • x_train : training images (#images x h x w x c)
  • y_train : labels as one-hot vectors (#images x #classes, or a list of #images x #classes arrays for multi-task training)
  • batch_size : batch size
  • alpha : hyperparameter; lambda is drawn from the beta distribution Be(alpha, alpha)

Get a mixed training batch:

x, y = next(training_generator)

Please refer to test_mixup.ipynb to see how it works.
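
Putting the pieces together, a minimal end-to-end sketch might look like this (CIFAR-10 is an assumed example dataset here; test_mixup.ipynb remains the authoritative reference):

import numpy as np
from keras.datasets import cifar10
from keras.utils import to_categorical
from mixup_generator import MixupGenerator

# load CIFAR-10 and convert the labels to one-hot vectors, as MixupGenerator expects
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train, x_test = x_train.astype(np.float32), x_test.astype(np.float32)
y_train, y_test = to_categorical(y_train, 10), to_categorical(y_test, 10)

training_generator = MixupGenerator(x_train, y_train, batch_size=32, alpha=0.2)()
x, y = next(training_generator)
print(x.shape, y.shape)  # (32, 32, 32, 3) (32, 10)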

Use MixupGenerator with fit_generator in Keras

Using MixupGenerator for training in Keras is straightforward: create the generator as above, then fit the model with fit_generator:

model.fit_generator(generator=training_generator,
                    steps_per_epoch=x_train.shape[0] // batch_size,
                    validation_data=(x_test, y_test),
                    epochs=epochs, verbose=1,
                    callbacks=callbacks)

Please refer to cifar10_resnet.py for a complete example, which is adapted from the official Keras examples.
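
Continuing from the usage sketch above, a complete minimal training script could look as follows (the small CNN is a hypothetical stand-in for the ResNet used in cifar10_resnet.py):

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

batch_size, epochs = 32, 30

model = Sequential([
    Conv2D(32, (3, 3), activation='relu', input_shape=(32, 32, 3)),
    MaxPooling2D((2, 2)),
    Conv2D(64, (3, 3), activation='relu'),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(10, activation='softmax'),
])
# mixup produces soft (interpolated) labels, which categorical_crossentropy accepts
model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

model.fit_generator(generator=training_generator,
                    steps_per_epoch=x_train.shape[0] // batch_size,
                    validation_data=(x_test, y_test),
                    epochs=epochs, verbose=1)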

MixupGenerator with ImageDataGenerator

The MixupGenerator can be combined with keras.preprocessing.image.ImageDataGenerator for further image augmentation:

datagen = ImageDataGenerator(
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True)
    
training_generator = MixupGenerator(x_train, y_train, batch_size=batch_size, alpha=0.2, datagen=datagen)()

In this case, the mixed-up training images are further augmented by ImageDataGenerator.

Mixup with Random Erasing

Random Erasing [2] is an image augmentation method for convolutional neural networks (CNNs). It regularizes models by training on images in which randomly selected rectangular regions are masked with random values.

Please refer to this repository for the details of the algorithm and its implementation.
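
For intuition, the core of Random Erasing can be sketched as a preprocessing function (a simplified stand-in for get_random_eraser from the linked repository; the parameter names and defaults here are illustrative):

import numpy as np

def random_erase(image, p=0.5, s_l=0.02, s_h=0.4, r_1=0.3, r_2=3.0, v_l=0, v_h=255):
    # with probability 1 - p, leave the image untouched
    if np.random.rand() > p:
        return image
    h, w, c = image.shape
    # sample the target area (as a fraction of the image) and the aspect ratio
    s = np.random.uniform(s_l, s_h) * h * w
    r = np.random.uniform(r_1, r_2)
    e_h = min(h, int(np.sqrt(s / r)))
    e_w = min(w, int(np.sqrt(s * r)))
    # place the rectangle at a random position and fill it with random values
    top = np.random.randint(0, h - e_h + 1)
    left = np.random.randint(0, w - e_w + 1)
    image = image.copy()
    image[top:top + e_h, left:left + e_w, :] = np.random.uniform(v_l, v_h, (e_h, e_w, c))
    return image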

Mixup can be combined with Random Erasing via ImageDataGenerator by:

from random_eraser import get_random_eraser

datagen = ImageDataGenerator(
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True,
    preprocessing_function=get_random_eraser(v_l=0, v_h=255))

generator = MixupGenerator(x_train, y_train, alpha=1.0, datagen=datagen)()

The resulting augmented images are both mixed up and randomly erased.

Results

(Results are from a single trial only.)

Setting               Test loss        Test accuracy
Without mixup         0.862150103855   0.8978
Mixup (alpha = 0.2)   0.510702615929   0.9117
Mixup (alpha = 0.5)   0.48489781661    0.9181
Mixup (alpha = 1.0)   0.493033925915   0.9167

References

[1] H. Zhang, M. Cisse, Y. N. Dauphin, and D. Lopez-Paz, "mixup: Beyond Empirical Risk Minimization," arXiv:1710.09412, 2017.

[2] Z. Zhong, L. Zheng, G. Kang, S. Li, and Y. Yang, "Random Erasing Data Augmentation," arXiv:1708.04896, 2017.
