tlatkowski / gans-2.0

License: MIT
Generative Adversarial Networks in TensorFlow 2.0

Programming Languages

Python
139335 projects - #7 most used programming language
Jupyter Notebook
11667 projects
Shell
77523 projects

Projects that are alternatives of or similar to gans-2.0

Pytorch Generative Model Collections
Collection of generative models in Pytorch version.
Stars: ✭ 2,296 (+2921.05%)
Mutual labels:  generative-adversarial-network, mnist, conditional-gan, fashion-mnist
TF2-GAN
🐳 GANs implemented in TensorFlow 2.x
Stars: ✭ 61 (-19.74%)
Mutual labels:  generative-adversarial-network, tensorflow-examples, tensorflow2
Awesome-Tensorflow2
Excellent extension packages and projects built on TensorFlow 2
Stars: ✭ 45 (-40.79%)
Mutual labels:  tensorflow-models, tensorflow-examples, tensorflow2
Generative adversarial networks 101
Keras implementations of Generative Adversarial Networks. GANs, DCGAN, CGAN, CCGAN, WGAN and LSGAN models with MNIST and CIFAR-10 datasets.
Stars: ✭ 138 (+81.58%)
Mutual labels:  generative-adversarial-network, mnist, cifar10
pcdarts-tf2
PC-DARTS (PC-DARTS: Partial Channel Connections for Memory-Efficient Differentiable Architecture Search, published in ICLR 2020) implemented in Tensorflow 2.0+. This is an unofficial implementation.
Stars: ✭ 25 (-67.11%)
Mutual labels:  cifar10, cifar-10, tensorflow2
Relativistic Average Gan Keras
The implementation of Relativistic average GAN with Keras
Stars: ✭ 36 (-52.63%)
Mutual labels:  generative-adversarial-network, mnist, cifar10
Cyclegan Qp
Official PyTorch implementation of "Artist Style Transfer Via Quadratic Potential"
Stars: ✭ 59 (-22.37%)
Mutual labels:  generative-adversarial-network, style-transfer, cyclegan
Gannotation
GANnotation (PyTorch): Landmark-guided face to face synthesis using GANs (And a triple consistency loss!)
Stars: ✭ 167 (+119.74%)
Mutual labels:  generative-adversarial-network, cyclegan
CycleGAN-gluon-mxnet
This repo attempts to reproduce Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks (CycleGAN) as a Gluon reimplementation.
Stars: ✭ 31 (-59.21%)
Mutual labels:  generative-adversarial-network, cyclegan
Age-Gender Estimation TF-Android
Age + Gender Estimation on Android with TensorFlow Lite
Stars: ✭ 34 (-55.26%)
Mutual labels:  tensorflow-models, tensorflow-examples
Awesome Tensorlayer
A curated list of dedicated resources and applications
Stars: ✭ 248 (+226.32%)
Mutual labels:  generative-adversarial-network, mnist
Wgan
Tensorflow Implementation of Wasserstein GAN (and Improved version in wgan_v2)
Stars: ✭ 228 (+200%)
Mutual labels:  generative-adversarial-network, tensorflow-models
Machine-Learning-Notebooks
15+ Machine/Deep Learning Projects in Ipython Notebooks
Stars: ✭ 66 (-13.16%)
Mutual labels:  tensorflow-models, cifar-10
Tensorflow Mnist Gan Dcgan
Tensorflow implementation of Generative Adversarial Networks (GAN) and Deep Convolutional Generative Adversarial Networks for the MNIST dataset.
Stars: ✭ 163 (+114.47%)
Mutual labels:  generative-adversarial-network, mnist
Tensorflow Infogan
🎎 InfoGAN: Interpretable Representation Learning
Stars: ✭ 149 (+96.05%)
Mutual labels:  generative-adversarial-network, mnist
Tsit
[ECCV 2020 Spotlight] A Simple and Versatile Framework for Image-to-Image Translation
Stars: ✭ 141 (+85.53%)
Mutual labels:  generative-adversarial-network, style-transfer
Pytorch Cyclegan And Pix2pix
Image-to-Image Translation in PyTorch
Stars: ✭ 16,477 (+21580.26%)
Mutual labels:  generative-adversarial-network, cyclegan
publications-arruda-ijcnn-2019
Cross-Domain Car Detection Using Unsupervised Image-to-Image Translation: From Day to Night
Stars: ✭ 59 (-22.37%)
Mutual labels:  generative-adversarial-network, cyclegan
pix2pix
This project uses a conditional generative adversarial network (cGAN) named Pix2Pix for the Image to image translation task.
Stars: ✭ 28 (-63.16%)
Mutual labels:  cyclegan, conditional-gan
3D-GuidedGradCAM-for-Medical-Imaging
This repo contains an implementation of Guided Grad-CAM for 3D medical imaging using NIfTI files in TensorFlow 2.0. Different input files can be used; in that case, edit the input to the Guided Grad-CAM model.
Stars: ✭ 60 (-21.05%)
Mutual labels:  tensorflow-examples, tensorflow2


GANs 2.0: Generative Adversarial Networks in TensorFlow 2.0

Project aim

The main aim of this project is to speed up the process of building deep learning pipelines based on Generative Adversarial Networks and to simplify prototyping of various generator/discriminator models. The library provides several GAN trainers that can be used off the shelf, such as:

  • Vanilla GAN
  • Conditional GAN
  • Cycle GAN
  • Wasserstein GAN
  • Progressive GAN (WIP)
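All of these trainers optimize some variant of the adversarial min-max objective, in which the discriminator learns to separate real from generated samples while the generator learns to fool it. As a rough illustration (plain NumPy, not the library's internals), the vanilla GAN losses reduce to binary cross-entropy on the discriminator's probabilities:

```python
import numpy as np

def discriminator_loss(real_probs, fake_probs, eps=1e-7):
    """Binary cross-entropy: push D(real) -> 1 and D(fake) -> 0."""
    real_probs = np.clip(real_probs, eps, 1 - eps)
    fake_probs = np.clip(fake_probs, eps, 1 - eps)
    return float(-np.mean(np.log(real_probs)) - np.mean(np.log(1 - fake_probs)))

def generator_loss(fake_probs, eps=1e-7):
    """Non-saturating generator loss: push D(fake) -> 1."""
    fake_probs = np.clip(fake_probs, eps, 1 - eps)
    return float(-np.mean(np.log(fake_probs)))
```

When the discriminator is maximally confused (outputting 0.5 everywhere), its loss is 2·log 2; when the generator's samples are scored close to 1, its loss approaches zero. Wasserstein GAN replaces these cross-entropy terms with a critic score difference, but the alternating update scheme is the same.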

Examples

Function modeling

Vanilla GAN (Gaussian function) Vanilla GAN (sigmoid function)
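In the function-modeling setting, the "real" data are 2-D points sampled from the target curve, and the generator learns to emit points that lie on it. A hedged sketch of how such a training set might be built for the Gaussian example (the exact sampling used in the repo may differ):

```python
import numpy as np

def gaussian(x, mu=0.0, sigma=1.0):
    """Gaussian probability density function evaluated at x."""
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

# "Real" samples for function modeling: (x, f(x)) pairs lying on the target curve.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, size=256)
real_points = np.stack([x, gaussian(x)], axis=1)  # shape (256, 2)
```

The sigmoid-function example works the same way, just with a different target curve.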

Image generation

Vanilla GAN (MNIST) Conditional GAN (MNIST)
vanilla_mnist conditional_mnist
Vanilla GAN (FASHION_MNIST) Conditional GAN (FASHION_MNIST)
vanilla_fashion_mnist conditional_fashion_mnist
Vanilla GAN (CIFAR10) Conditional GAN (CIFAR10)
vanilla_cifar10 conditional_cifar10

Image translation

Cycle GAN (SUMMER2WINTER) Cycle GAN (WINTER2SUMMER)
cycle_s2w cycle_w2s

Installation

Install with GPU support

pip install gans2[tensorflow_gpu]

Install with CPU support

pip install gans2[tensorflow]

Running the training pipeline: code examples for a Vanilla GAN for MNIST digit generation

Pre-defined models

import tensorflow as tf
from easydict import EasyDict as edict

from gans.datasets import mnist
from gans.models.discriminators import discriminator
from gans.models.generators.latent_to_image import latent_to_image
from gans.trainers import optimizers
from gans.trainers import vanilla_gan_trainer

model_parameters = edict({
    'img_height':                  28,
    'img_width':                   28,
    'num_channels':                1,
    'batch_size':                  16,
    'num_epochs':                  10,
    'buffer_size':                 1000,
    'latent_size':                 100,
    'learning_rate_generator':     0.0001,
    'learning_rate_discriminator': 0.0001,
    'save_images_every_n_steps':   10
})

dataset = mnist.MnistDataset(model_parameters)

generator = latent_to_image.LatentToImageGenerator(model_parameters)
discriminator = discriminator.Discriminator(model_parameters)

generator_optimizer = optimizers.Adam(
    learning_rate=model_parameters.learning_rate_generator,
    beta_1=0.5,
)
discriminator_optimizer = optimizers.Adam(
    learning_rate=model_parameters.learning_rate_discriminator,
    beta_1=0.5,
)

gan_trainer = vanilla_gan_trainer.VanillaGANTrainer(
    batch_size=model_parameters.batch_size,
    generator=generator,
    discriminator=discriminator,
    training_name='VANILLA_GAN_MNIST',
    generator_optimizer=generator_optimizer,
    discriminator_optimizer=discriminator_optimizer,
    latent_size=model_parameters.latent_size,
    continue_training=False,
    save_images_every_n_steps=model_parameters.save_images_every_n_steps,
    visualization_type='image',
)

gan_trainer.train(
    dataset=dataset,
    num_epochs=model_parameters.num_epochs,
)
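Each training step draws a fresh batch of latent vectors of length `latent_size` from a normal distribution and feeds them to the generator. A minimal sketch of that sampling step (generic NumPy, not the library's internals), matching the `batch_size=16` and `latent_size=100` settings above:

```python
import numpy as np

batch_size, latent_size = 16, 100
rng = np.random.default_rng(0)

# One batch of latent noise: the generator maps each row to one image.
z = rng.standard_normal((batch_size, latent_size)).astype("float32")
print(z.shape)  # (16, 100)
```

With `visualization_type='image'`, generated samples for a fixed latent batch are saved every `save_images_every_n_steps` steps, which makes it easy to watch the same noise vectors turn into progressively sharper digits.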

Custom models

import tensorflow as tf
from easydict import EasyDict as edict
from tensorflow import keras
from tensorflow.keras import layers

from gans.datasets import mnist
from gans.models import sequential
from gans.trainers import optimizers
from gans.trainers import vanilla_gan_trainer

model_parameters = edict({
    'img_height':                  28,
    'img_width':                   28,
    'num_channels':                1,
    'batch_size':                  16,
    'num_epochs':                  10,
    'buffer_size':                 1000,
    'latent_size':                 100,
    'learning_rate_generator':     0.0001,
    'learning_rate_discriminator': 0.0001,
    'save_images_every_n_steps':   10
})

dataset = mnist.MnistDataset(model_parameters)

generator = sequential.SequentialModel(
    layers=[
        keras.Input(shape=[model_parameters.latent_size]),
        layers.Dense(units=7 * 7 * 256, use_bias=False),
        layers.BatchNormalization(),
        layers.LeakyReLU(),

        layers.Reshape((7, 7, 256)),
        layers.Conv2DTranspose(128, (5, 5), strides=(1, 1), padding='same', use_bias=False),
        layers.BatchNormalization(),
        layers.LeakyReLU(),

        layers.Conv2DTranspose(64, (5, 5), strides=(2, 2), padding='same', use_bias=False),
        layers.BatchNormalization(),
        layers.LeakyReLU(),

        layers.Conv2DTranspose(1, (5, 5), strides=(2, 2), padding='same', use_bias=False, activation='tanh')
    ]
)

discriminator = sequential.SequentialModel(
    [
        keras.Input(
            shape=[
                model_parameters.img_height,
                model_parameters.img_width,
                model_parameters.num_channels,
            ]),
        layers.Conv2D(filters=64, kernel_size=(5, 5), strides=(2, 2), padding='same'),
        layers.LeakyReLU(),
        layers.Dropout(0.3),

        layers.Conv2D(filters=128, kernel_size=(5, 5), strides=(2, 2), padding='same'),
        layers.LeakyReLU(),
        layers.Dropout(rate=0.3),

        layers.Flatten(),
        layers.Dense(units=1),
    ]
)

generator_optimizer = optimizers.Adam(
    learning_rate=model_parameters.learning_rate_generator,
    beta_1=0.5,
)
discriminator_optimizer = optimizers.Adam(
    learning_rate=model_parameters.learning_rate_discriminator,
    beta_1=0.5,
)

gan_trainer = vanilla_gan_trainer.VanillaGANTrainer(
    batch_size=model_parameters.batch_size,
    generator=generator,
    discriminator=discriminator,
    training_name='VANILLA_GAN_MNIST_CUSTOM_MODELS',
    generator_optimizer=generator_optimizer,
    discriminator_optimizer=discriminator_optimizer,
    latent_size=model_parameters.latent_size,
    continue_training=False,
    save_images_every_n_steps=model_parameters.save_images_every_n_steps,
    visualization_type='image',
)

gan_trainer.train(
    dataset=dataset,
    num_epochs=model_parameters.num_epochs,
)
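The custom generator upsamples 7×7 → 7×7 → 14×14 → 28×28 through its three `Conv2DTranspose` layers: with `padding='same'`, a transposed convolution's output size is simply input size × stride. A quick check of that arithmetic (the generic formula, not library code):

```python
def conv_transpose_same_out(size, stride):
    """Output spatial size of Conv2DTranspose with padding='same'."""
    return size * stride

# Walk the generator's spatial size through its three strided layers.
h = 7
for stride in (1, 2, 2):
    h = conv_transpose_same_out(h, stride)
print(h)  # 28
```

The discriminator runs the same arithmetic in reverse: two stride-2 `Conv2D` layers take the 28×28 input back down to 7×7 before the final dense layer.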

More code examples

Vanilla GAN for Gaussian function modeling

Vanilla GAN for sigmoid function modeling

Conditional GAN for MNIST digit generation

Cycle GAN for summer2winter/winter2summer style transfer

Wasserstein GAN for MNIST digit generation

Monitoring model training

To visualize the training process (loss values, generated outputs), run the following command in the project directory:

tensorboard --logdir outputs

To follow the training process, open http://your-workstation-name:6006/ in your browser.

The picture below presents the TensorBoard view launched for two experiments: VANILLA_MNIST and VANILLA_FASHION_MNIST.

References

  1. Deep Convolutional Generative Adversarial Network Tutorial in TensorFlow
  2. Cycle GAN Tutorial in TensorFlow
  3. Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks paper