
leocvml / CycleGAN-gluon-mxnet

Licence: other
This repo attempts to reproduce "Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks" (CycleGAN) as a Gluon reimplementation.


Projects that are alternatives of or similar to CycleGAN-gluon-mxnet

Cyclegan
Software that can generate photos from paintings, turn horses into zebras, perform style transfer, and more.
Stars: ✭ 10,933 (+35167.74%)
Mutual labels:  computer-graphics, generative-adversarial-network, cyclegan
Contrastive Unpaired Translation
Contrastive unpaired image-to-image translation, faster and lighter training than cyclegan (ECCV 2020, in PyTorch)
Stars: ✭ 822 (+2551.61%)
Mutual labels:  computer-graphics, generative-adversarial-network, cyclegan
Pytorch Cyclegan
A clean and readable Pytorch implementation of CycleGAN
Stars: ✭ 558 (+1700%)
Mutual labels:  computer-graphics, generative-adversarial-network, cyclegan
Pytorch Cyclegan And Pix2pix
Image-to-Image Translation in PyTorch
Stars: ✭ 16,477 (+53051.61%)
Mutual labels:  computer-graphics, generative-adversarial-network, cyclegan
Quantization.mxnet
Simulate quantization and quantization aware training for MXNet-Gluon models.
Stars: ✭ 42 (+35.48%)
Mutual labels:  mxnet, gluon
Gluonrank
Ranking made easy
Stars: ✭ 39 (+25.81%)
Mutual labels:  mxnet, gluon
Ko en neural machine translation
Korean English NMT(Neural Machine Translation) with Gluon
Stars: ✭ 55 (+77.42%)
Mutual labels:  mxnet, gluon
Mxnet Im2rec tutorial
this simple tutorial will introduce how to use im2rec for mx.image.ImageIter , ImageDetIter and how to use im2rec for COCO DataSet
Stars: ✭ 97 (+212.9%)
Mutual labels:  mxnet, gluon
Aws Machine Learning University Accelerated Tab
Machine Learning University: Accelerated Tabular Data Class
Stars: ✭ 718 (+2216.13%)
Mutual labels:  mxnet, gluon
Gluon2pytorch
Gluon to PyTorch deep neural network model converter
Stars: ✭ 70 (+125.81%)
Mutual labels:  mxnet, gluon
ResidualAttentionNetwork
A Gluon implement of Residual Attention Network. Best acc on cifar10-97.78%.
Stars: ✭ 104 (+235.48%)
Mutual labels:  mxnet, gluon
Sockeye
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
Stars: ✭ 990 (+3093.55%)
Mutual labels:  mxnet, gluon
Efficientnet
Gluon implementation of EfficientNet and EfficientNet-lite
Stars: ✭ 30 (-3.23%)
Mutual labels:  mxnet, gluon
Aws Machine Learning University Accelerated Cv
Machine Learning University: Accelerated Computer Vision Class
Stars: ✭ 1,068 (+3345.16%)
Mutual labels:  mxnet, gluon
Mxnet Centernet
Gluon implementation of "Objects as Points", aka "CenterNet"
Stars: ✭ 29 (-6.45%)
Mutual labels:  mxnet, gluon
Mxnet Gluon Syncbn
MXNet Gluon Synchronized Batch Normalization Preview
Stars: ✭ 78 (+151.61%)
Mutual labels:  mxnet, gluon
Mxnet.sharp
.NET Standard bindings for Apache MxNet with Imperative, Symbolic and Gluon Interface for developing, training and deploying Machine Learning models in C#. https://mxnet.tech-quantum.com/
Stars: ✭ 134 (+332.26%)
Mutual labels:  mxnet, gluon
Aws Machine Learning University Accelerated Nlp
Machine Learning University: Accelerated Natural Language Processing Class
Stars: ✭ 1,695 (+5367.74%)
Mutual labels:  mxnet, gluon
Single Path One Shot Nas Mxnet
Single Path One-Shot NAS MXNet implementation with full training and searching pipeline. Support both Block and Channel Selection. Searched models better than the original paper are provided.
Stars: ✭ 136 (+338.71%)
Mutual labels:  mxnet, gluon
Gluon Nlp
NLP made easy
Stars: ✭ 2,344 (+7461.29%)
Mutual labels:  mxnet, gluon

CycleGAN-gluon-mxnet

This repo attempts to reproduce "Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks" (CycleGAN) as a Gluon reimplementation.

Quick start

  1. Download the dataset (the included sample is a small apple <-> orange set with 45 images per domain).

     You can download the complete dataset from the dataset website.

  2. Train.
  3. Inference (the provided weights were trained on the complete apple2orange dataset); see the sketch after this list.
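A minimal inference sketch: the parameter file name 'G_AB.params' and the test image path are placeholders rather than the repo's actual file names, and scaling inputs to [0, 1] is an assumption that matches the generator's sigmoid output.

import mxnet as mx
from mxnet import image

ctx = mx.cpu()                                    # or mx.gpu()
netG = Generator_256()                            # generator class shown below
netG.load_params('G_AB.params', ctx=ctx)          # placeholder file name

img = image.imresize(image.imread('testA/sample.jpg'), 256, 256)   # placeholder path
x = img.astype('float32').transpose((2, 0, 1)).expand_dims(axis=0) / 255.0
fake_B = netG(x.as_in_context(ctx))               # translated image, values in [0, 1]
out = (fake_B[0].transpose((1, 2, 0)) * 255).astype('uint8')       # back to HWC uint8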

Requirements

mxnet 1.1.0

Abstract

Image-to-image translation: learning the mapping between an input image and an output image using a training set of aligned image pairs. However, for many tasks, paired training data will not be available.

We present an approach for learning to translate an image from a source domain X to a target domain Y in the absence of paired examples.

Network architecture

Generator

The network consists of:
c7s1-32, d64, d128, 9 × R128, u64, u32, c7s1-3

(In the paper's notation, c7s1-k is a 7×7 Convolution-InstanceNorm-ReLU layer with k filters and stride 1, dk is a 3×3 convolution with stride 2, Rk is a residual block of two 3×3 convolutions, and uk is a 3×3 fractional-strided convolution with stride 1/2.)
from mxnet import gluon
from mxnet.gluon import nn

class Generator_256(gluon.nn.HybridBlock):
    def __init__(self):
        super(Generator_256, self).__init__()
        self.net = nn.HybridSequential()
        with self.net.name_scope():
            self.net.add(
                nn.ReflectionPad2D(3),
                nn.Conv2D(32, kernel_size=7, strides=1),
                nn.InstanceNorm(),
                nn.Activation('relu'),   # c7s1-32
                conv_inst_relu(64),      # d64
                conv_inst_relu(128),     # d128
            )
            for _ in range(9):           # 9 x R128
                self.net.add(
                    ResBlock(128)
                )
            self.net.add(
                upconv_inst_relu(64),    # u64
                upconv_inst_relu(32),    # u32
                nn.ReflectionPad2D(3),
                nn.Conv2D(3, kernel_size=7, strides=1),
                nn.Activation('sigmoid') # c7s1-3, output in [0, 1]
            )

    def hybrid_forward(self, F, x):
        return self.net(x)
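The block above relies on three helpers (conv_inst_relu, upconv_inst_relu, ResBlock) that are defined elsewhere in the repo. A hedged sketch of what they plausibly look like, following the dk / Rk / uk layers described above; the exact kernel sizes and padding here are assumptions, not copied from the repo.

from mxnet.gluon import nn

def conv_inst_relu(channels):    # dk: 3x3 conv, stride 2, InstanceNorm, ReLU (assumed)
    block = nn.HybridSequential()
    block.add(
        nn.Conv2D(channels, kernel_size=3, strides=2, padding=1),
        nn.InstanceNorm(),
        nn.Activation('relu'),
    )
    return block

def upconv_inst_relu(channels):  # uk: 3x3 transposed conv, stride 2, InstanceNorm, ReLU (assumed)
    block = nn.HybridSequential()
    block.add(
        nn.Conv2DTranspose(channels, kernel_size=3, strides=2, padding=1, output_padding=1),
        nn.InstanceNorm(),
        nn.Activation('relu'),
    )
    return block

class ResBlock(nn.HybridBlock):  # Rk: two 3x3 convs with a skip connection (assumed)
    def __init__(self, channels):
        super(ResBlock, self).__init__()
        self.body = nn.HybridSequential()
        self.body.add(
            nn.ReflectionPad2D(1),
            nn.Conv2D(channels, kernel_size=3, strides=1),
            nn.InstanceNorm(),
            nn.Activation('relu'),
            nn.ReflectionPad2D(1),
            nn.Conv2D(channels, kernel_size=3, strides=1),
            nn.InstanceNorm(),
        )

    def hybrid_forward(self, F, x):
        return x + self.body(x)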

Discriminator

The discriminator uses kernel size 3:
class Discriminator(gluon.nn.HybridBlock):
    def __init__(self):
        super(Discriminator, self).__init__()
        self.net = nn.HybridSequential()
        with self.net.name_scope():
            self.net.add(
                nn.Conv2D(64, kernel_size=3, strides=2, padding=1),
                nn.LeakyReLU(0.2),
                nn.Conv2D(128, kernel_size=3, strides=2, padding=1),
                nn.InstanceNorm(),
                nn.LeakyReLU(0.2),
                nn.Conv2D(256, kernel_size=3, strides=2, padding=1),
                nn.InstanceNorm(),
                nn.LeakyReLU(0.2),
                nn.Conv2D(512, kernel_size=3, strides=2, padding=1),
                nn.InstanceNorm(),
                nn.LeakyReLU(0.2),
                nn.Conv2D(1, kernel_size=1, strides=1),  # 1-channel map of per-patch real/fake scores
            )

    def hybrid_forward(self, F, x):
        return self.net(x)
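Because every layer except the final 1×1 convolution uses stride 2, a 256×256 input is reduced to a 16×16 map of per-patch real/fake scores (a PatchGAN-style output). A quick shape check:

import mxnet as mx
from mxnet import nd

netD = Discriminator()
netD.initialize(mx.init.Normal(0.02))
scores = netD(nd.random.uniform(shape=(1, 3, 256, 256)))
print(scores.shape)   # (1, 1, 16, 16): one real/fake score per image patch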

Training step
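Both snippets below assume networks, losses, and trainers roughly like the following. This is a sketch: the names match the snippets, but the loss choices and Adam hyper-parameters are assumptions based on the paper's defaults, not copied from the repo.

import mxnet as mx
from mxnet import gluon, autograd, nd

ctx = mx.gpu()                               # or mx.cpu()
G_AB, G_BA = Generator_256(), Generator_256()
D_A, D_B = Discriminator(), Discriminator()
for net in (G_AB, G_BA, D_A, D_B):
    net.initialize(mx.init.Normal(0.02), ctx=ctx)

L2_loss = gluon.loss.L2Loss()                # least-squares GAN loss
cyc_loss = gluon.loss.L1Loss()               # cycle-consistency loss (L1 in the paper)
lamba = 10                                   # cycle-consistency weight (lambda)

adam = {'learning_rate': 0.0002, 'beta1': 0.5}
DA_trainer = gluon.Trainer(D_A.collect_params(), 'adam', adam)
DB_trainer = gluon.Trainer(D_B.collect_params(), 'adam', adam)
GAB_trainer = gluon.Trainer(G_AB.collect_params(), 'adam', adam)
GBA_trainer = gluon.Trainer(G_BA.collect_params(), 'adam', adam)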

Train the discriminator

  1. D_A aims to distinguish between translated samples G_BA(B) and real samples A.
with autograd.record():   # train D_A
    real_A = D_A(A)          # discriminator score for real A images
    fake_BA = G_BA(B)        # generate a fake A image from B
    fake_A = D_A(fake_BA)    # discriminator score for the fake A image

    real_label = nd.ones_like(real_A, ctx=ctx)
    fake_label = nd.zeros_like(fake_A, ctx=ctx)

    errA_real = L2_loss(real_A, real_label)
    errA_fake = L2_loss(fake_A, fake_label)
    errDA = (errA_real + errA_fake) * 0.5
errDA.backward()
DA_trainer.step(A.shape[0])

Train the generator

  1. Generate fake B from domain A.
  2. Reconstruct A from the fake B.
  3. Compute the cycle-consistency loss between the reconstructed A and A (lambda = 10).
  4. Use the fake B image to fool discriminator B.
with autograd.record():
    fake_AB = G_AB(A)                  # fake B image generated from A
    fake_A = G_BA(fake_AB)             # reconstructed A (A -> B -> A)
    cycA_loss = cyc_loss(fake_A, A)    # cycle-consistency loss

    fake_B = D_B(fake_AB)              # discriminator B score for the fake B image

    errG_AB = L2_loss(fake_B, real_label) + lamba * cycA_loss   # real_label reused from the discriminator step
    errG_AB.backward()
GAB_trainer.step(A.shape[0])
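The full loop also performs the mirrored updates for the other direction (D_B against real B vs. G_AB(A), and G_BA with the B -> A -> B cycle). A sketch of the mirrored generator step, reusing the names from the snippets above:

with autograd.record():
    fake_BA = G_BA(B)                    # fake A image generated from B
    rec_B = G_AB(fake_BA)                # reconstructed B (B -> A -> B)
    cycB_loss = cyc_loss(rec_B, B)       # cycle-consistency loss

    fake_A_score = D_A(fake_BA)          # try to fool discriminator A
    errG_BA = L2_loss(fake_A_score, real_label) + lamba * cycB_loss
    errG_BA.backward()
GBA_trainer.step(B.shape[0])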

Result

apple2orange

orange2apple

Reference

https://github.com/junyanz/CycleGAN (thanks to the authors for proposing this method)

Thanks also to the Gluon and MXNet teams for this wonderful tool, which let us reimplement the project very quickly.
