Dref360 / keras-transform

License: MIT
Library for data augmentation

Programming Languages

Python

Projects that are alternatives of or similar to keras-transform

machine learning course
Artificial intelligence/machine learning course at UCF in Spring 2020 (Fall 2019 and Spring 2019)
Stars: ✭ 47 (+51.61%)
Mutual labels:  data-augmentation
ccgl
TKDE 2022. CCGL: Contrastive Cascade Graph Learning.
Stars: ✭ 20 (-35.48%)
Mutual labels:  data-augmentation
KitanaQA
KitanaQA: Adversarial training and data augmentation for neural question-answering models
Stars: ✭ 58 (+87.1%)
Mutual labels:  data-augmentation
advchain
[Medical Image Analysis] Adversarial Data Augmentation with Chained Transformations (AdvChain)
Stars: ✭ 32 (+3.23%)
Mutual labels:  data-augmentation
SnapMix
SnapMix: Semantically Proportional Mixing for Augmenting Fine-grained Data (AAAI 2021)
Stars: ✭ 127 (+309.68%)
Mutual labels:  data-augmentation
EPCDepth
[ICCV 2021] Excavating the Potential Capacity of Self-Supervised Monocular Depth Estimation
Stars: ✭ 105 (+238.71%)
Mutual labels:  data-augmentation
ChineseNER
All about Chinese NER (named entity recognition)
Stars: ✭ 241 (+677.42%)
Mutual labels:  data-augmentation
audio degrader
Audio degradation toolbox in python, with a command-line tool. It is useful to apply controlled degradations to audio: e.g. data augmentation, evaluation in noisy conditions, etc.
Stars: ✭ 40 (+29.03%)
Mutual labels:  data-augmentation
PointCutMix
Our code for the paper 'PointCutMix: Regularization Strategy for Point Cloud Classification'
Stars: ✭ 42 (+35.48%)
Mutual labels:  data-augmentation
awesome-graph-self-supervised-learning
Awesome Graph Self-Supervised Learning
Stars: ✭ 805 (+2496.77%)
Mutual labels:  data-augmentation
specAugment
Tensor2tensor experiment with SpecAugment
Stars: ✭ 46 (+48.39%)
Mutual labels:  data-augmentation
CAPRICEP
An extended TSP (Time Stretched Pulse). CAPRICEP substantially replaces FVN. CAPRICEP enables interactive and real-time measurement of linear time-invariant, non-linear time-invariant, and random, time-varying responses simultaneously.
Stars: ✭ 23 (-25.81%)
Mutual labels:  data-augmentation
MobilePose
Light-weight Single Person Pose Estimator
Stars: ✭ 588 (+1796.77%)
Mutual labels:  data-augmentation
GaNDLF
A generalizable application framework for segmentation, regression, and classification using PyTorch
Stars: ✭ 77 (+148.39%)
Mutual labels:  data-augmentation
manifold mixup
Tensorflow implementation of the Manifold Mixup machine learning research paper
Stars: ✭ 24 (-22.58%)
Mutual labels:  data-augmentation
mrnet
Building an ACL tear detector to spot knee injuries from MRIs with PyTorch (MRNet)
Stars: ✭ 98 (+216.13%)
Mutual labels:  data-augmentation
Learning-From-Rules
Implementation of experiments in paper "Learning from Rules Generalizing Labeled Exemplars" to appear in ICLR2020 (https://openreview.net/forum?id=SkeuexBtDr)
Stars: ✭ 46 (+48.39%)
Mutual labels:  data-augmentation
Unets
Implementation of UNets for Lung Segmentation
Stars: ✭ 18 (-41.94%)
Mutual labels:  data-augmentation
GAug
AAAI'21: Data Augmentation for Graph Neural Networks
Stars: ✭ 139 (+348.39%)
Mutual labels:  data-augmentation
coursera-gan-specialization
Programming assignments and quizzes from all courses within the GANs specialization offered by deeplearning.ai
Stars: ✭ 277 (+793.55%)
Mutual labels:  data-augmentation

keras-transform

Library for data augmentation

ANNOUNCEMENT: I will no longer actively work on this library; the changes introduced in Keras 2.2.0 have made it obsolete. Please see my blog post: https://dref360.github.io/deterministic-da/

This library provides a data augmentation pipeline for Sequence objects.
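For reference, a minimal keras.utils.Sequence that yields (X, y) batches might look like the sketch below. This assumes only the standard Sequence API (__len__ and __getitem__); ImageSequence and its fields are hypothetical names, not part of keras-transform.

import numpy as np
from keras.utils import Sequence

class ImageSequence(Sequence):
    """Toy Sequence yielding (X, y) batches from in-memory arrays."""
    def __init__(self, images, labels, batch_size=32):
        self.images, self.labels = images, labels
        self.batch_size = batch_size

    def __len__(self):
        # Number of batches per epoch.
        return int(np.ceil(len(self.images) / self.batch_size))

    def __getitem__(self, idx):
        start = idx * self.batch_size
        end = start + self.batch_size
        return self.images[start:end], self.labels[start:end]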

Keras-transform lets the user specify a mask that controls which parts of a batch are augmented. This is useful in tasks such as segmentation, where the ground truth should be augmented together with the input. See simple.ipynb.

Keras-transform also works with multiple inputs and outputs through nested masks. For example, with two inputs and one output, mask=[[True,False],False] augments the first input but not the second input or the output, as in the sketch below.
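A hedged sketch of that nested-mask call, assuming seq is a keras.utils.Sequence returning ([input_a, input_b], y); only the transformer names shown in this README are used.

from transform.sequences import SequentialTransformer, RandomVerticalFlipTransformer

seq = ...  # A Sequence returning ([input_a, input_b], y)
sequence = SequentialTransformer([RandomVerticalFlipTransformer()])

# Nested mask: augment the first input only; the second input and the
# output y are passed through unchanged.
augmented_sequence = sequence(seq, mask=[[True, False], False])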

keras-transform in 10 lines

from transform.sequences import SequentialTransformer
from transform.sequences import RandomZoomTransformer, RandomVerticalFlipTransformer

seq = ... # A keras.utils.Sequence object that returns a tuple (X,y)
model = ... # A keras Model

"""
A transformer transforms the input. Most data augmentation functions are implemented in transform.sequences.
We can chain transformers together using the SequentialTransformer that takes a list of transformers.
"""
sequence = SequentialTransformer([RandomZoomTransformer(zoom_range=(0.8, 1.2)),
                                  RandomVerticalFlipTransformer()])

# To augment X but not y
augmented_sequence = sequence(seq, mask=[True, False])
model.fit_generator(augmented_sequence, steps_per_epoch=len(augmented_sequence))

# To augment X and y
augmented_sequence = sequence(seq, mask=[True, True])  # Alternatively, mask=True would also work.
model.fit_generator(augmented_sequence, steps_per_epoch=len(augmented_sequence))

Contributing

Anyone can contribute by submitting a PR. Any PR that adds a new feature must include tests.

Example

Here's an example where X is an image and the ground truth is the grayscale version of the input; a minimal version is sketched below. The full code can be found here.
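A minimal sketch of such a pipeline, assuming a Sequence that returns (RGB batch, grayscale batch). GrayscaleSequence and its naive grayscale conversion are hypothetical, not part of the library; the transformer calls follow the example above.

import numpy as np
from keras.utils import Sequence
from transform.sequences import SequentialTransformer
from transform.sequences import RandomZoomTransformer, RandomVerticalFlipTransformer

class GrayscaleSequence(Sequence):
    """Yields (RGB batch, grayscale batch); the grayscale image is the target."""
    def __init__(self, images, batch_size=16):
        self.images, self.batch_size = images, batch_size

    def __len__(self):
        return int(np.ceil(len(self.images) / self.batch_size))

    def __getitem__(self, idx):
        batch = self.images[idx * self.batch_size:(idx + 1) * self.batch_size]
        gray = batch.mean(axis=-1, keepdims=True)  # naive RGB -> grayscale
        return batch, gray

seq = GrayscaleSequence(np.random.rand(100, 64, 64, 3))
sequence = SequentialTransformer([RandomZoomTransformer(zoom_range=(0.8, 1.2)),
                                  RandomVerticalFlipTransformer()])
# mask=True: augment both X and y (equivalently mask=[True, True]).
augmented_sequence = sequence(seq, mask=True)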

