
wchen342 / SketchyGAN

License: MIT
Code for the paper "SketchyGAN: Towards Diverse and Realistic Sketch to Image Synthesis"

Projects that are alternatives to or similar to SketchyGAN

DoppelGANger
[IMC 2020 (Best Paper Finalist)] Using GANs for Sharing Networked Time Series Data: Challenges, Initial Promise, and Open Questions
Stars: ✭ 97 (-14.16%)
Mutual labels:  generative-adversarial-network
DeLiGAN
An implementation of the GAN proposed in the CVPR 2017 paper "DeLiGAN: Generative Adversarial Networks for Diverse and Limited Data". DeLiGAN is a simple but effective modification of the GAN framework that aims to improve performance on datasets that are diverse yet small.
Stars: ✭ 103 (-8.85%)
Mutual labels:  generative-adversarial-network
Exermote
Using Machine Learning to predict the type of exercise from movement data
Stars: ✭ 108 (-4.42%)
Mutual labels:  generative-adversarial-network
LGGAN
[CVPR 2020] Local Class-Specific and Global Image-Level Generative Adversarial Networks for Semantic-Guided Scene Generation
Stars: ✭ 97 (-14.16%)
Mutual labels:  generative-adversarial-network
SpectralNormalizationKeras
Spectral Normalization for Keras Dense and Convolution layers (a minimal sketch of the technique follows this list)
Stars: ✭ 100 (-11.5%)
Mutual labels:  generative-adversarial-network
FaceAging by CycleGAN
Stars: ✭ 105 (-7.08%)
Mutual labels:  generative-adversarial-network
SPRINT_gan
Privacy-preserving generative deep neural networks support clinical data sharing
Stars: ✭ 92 (-18.58%)
Mutual labels:  generative-adversarial-network
TextSum-GAN
TensorFlow re-implementation of a GAN for text summarization
Stars: ✭ 111 (-1.77%)
Mutual labels:  generative-adversarial-network
GAAL-Based Outlier Detection
Outlier detection with Generative Adversarial Active Learning (GAAL)
Stars: ✭ 102 (-9.73%)
Mutual labels:  generative-adversarial-network
StyleGAN2 Projecting Images
Projecting images to latent space with StyleGAN2 (an illustrative projection loop follows this list).
Stars: ✭ 102 (-9.73%)
Mutual labels:  generative-adversarial-network
3D-RecGAN Extended
🔥 3D-RecGAN++ in TensorFlow (TPAMI 2018)
Stars: ✭ 98 (-13.27%)
Mutual labels:  generative-adversarial-network
LSD-Seg
Learning from Synthetic Data: Addressing Domain Shift for Semantic Segmentation
Stars: ✭ 99 (-12.39%)
Mutual labels:  generative-adversarial-network
Pixel2style2pixel
Official Implementation for "Encoding in Style: a StyleGAN Encoder for Image-to-Image Translation"
Stars: ✭ 1,395 (+1134.51%)
Mutual labels:  generative-adversarial-network
TAGAN
An official PyTorch implementation of the paper "Text-Adaptive Generative Adversarial Networks: Manipulating Images with Natural Language", NeurIPS 2018
Stars: ✭ 97 (-14.16%)
Mutual labels:  generative-adversarial-network
PyTorch GAN Timeseries
GANs for time series generation in PyTorch
Stars: ✭ 109 (-3.54%)
Mutual labels:  generative-adversarial-network
PorousMediaGAN
Reconstruction of three-dimensional porous media using generative adversarial neural networks
Stars: ✭ 94 (-16.81%)
Mutual labels:  generative-adversarial-network
NatSR
Natural and Realistic Single Image Super-Resolution with Explicit Natural Manifold Discrimination (CVPR 2019)
Stars: ✭ 105 (-7.08%)
Mutual labels:  generative-adversarial-network
GPND
Generative Probabilistic Novelty Detection with Adversarial Autoencoders
Stars: ✭ 112 (-0.88%)
Mutual labels:  generative-adversarial-network
StyleGAN Web
A web port of NVlabs' StyleGAN.
Stars: ✭ 112 (-0.88%)
Mutual labels:  generative-adversarial-network
UNet StyleGAN2
A PyTorch implementation of StyleGAN2 with a U-Net discriminator
Stars: ✭ 106 (-6.19%)
Mutual labels:  generative-adversarial-network
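
Two entries above name techniques concrete enough that a short sketch may help. First, spectral normalization (SpectralNormalizationKeras): it constrains a layer by dividing its weight matrix by an estimate of its largest singular value, typically obtained with a few steps of power iteration. The NumPy sketch below is a framework-free illustration of that idea, not the SpectralNormalizationKeras API; `spectral_normalize` and its arguments are hypothetical names.

```python
import numpy as np

def spectral_normalize(W, n_iters=5, u=None):
    """Estimate the largest singular value of W by power iteration,
    then divide W by it (illustrative helper, not the
    SpectralNormalizationKeras API)."""
    rows, _ = W.shape
    if u is None:
        u = np.random.randn(rows)
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v) + 1e-12
        u = W @ v
        u /= np.linalg.norm(u) + 1e-12
    sigma = u @ W @ v      # estimated spectral norm of W
    return W / sigma, u    # return u so callers can warm-start the next call

# The normalized weight has spectral norm close to 1.
W = np.random.randn(64, 128)
W_sn, _ = spectral_normalize(W)
print(np.linalg.svd(W_sn, compute_uv=False)[0])  # ~1.0
```

In GAN training this is typically applied to the discriminator's weights at every forward pass, with `u` carried between steps so a single iteration suffices.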
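Second, latent-space projection (StyleGAN2 Projecting Images): freeze a pretrained generator and optimize a latent code by gradient descent so the generated output matches a target image. The PyTorch loop below shows only that optimization pattern; `generator`, `target`, and the plain L2 loss are stand-in assumptions, not the repository's code (which uses a pretrained StyleGAN2 and a perceptual loss such as LPIPS).

```python
import torch

def project(generator, target, latent_dim=512, steps=200, lr=0.05):
    """Illustrative projection: optimize latent z so that generator(z)
    approximates target. Not the StyleGAN2 projector itself."""
    for p in generator.parameters():
        p.requires_grad_(False)            # generator stays frozen
    z = torch.randn(1, latent_dim, requires_grad=True)
    opt = torch.optim.Adam([z], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = torch.mean((generator(z) - target) ** 2)  # stand-in for LPIPS
        loss.backward()
        opt.step()
    return z.detach()

# Toy stand-in generator; a real run would load a pretrained
# StyleGAN2 synthesis network instead.
toy_gen = torch.nn.Sequential(torch.nn.Linear(512, 3 * 8 * 8), torch.nn.Tanh())
target = torch.rand(1, 3 * 8 * 8) * 2 - 1
z_hat = project(toy_gen, target)
```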

SketchyGAN: Towards Diverse and Realistic Sketch to Image Synthesis

Code for "SketchyGAN: Towards Diverse and Realistic Sketch to Image Synthesis".

This repository has been moved to https://git.droidware.info/wchen342/SketchyGAN.
