
seasonSH / Warpgan

License: MIT
(CVPR 2019 Oral) Style Transfer with Geometric Deformation

Programming Languages

python

Projects that are alternatives to or similar to Warpgan

CS231n
My solutions for Assignments of CS231n: Convolutional Neural Networks for Visual Recognition
Stars: ✭ 30 (-86.05%)
Mutual labels:  gan, style-transfer
Hidt
Official repository for the paper "High-Resolution Daytime Translation Without Domain Labels" (CVPR2020, Oral)
Stars: ✭ 513 (+138.6%)
Mutual labels:  gan, style-transfer
HistoGAN
Reference code for the paper HistoGAN: Controlling Colors of GAN-Generated and Real Images via Color Histograms (CVPR 2021).
Stars: ✭ 158 (-26.51%)
Mutual labels:  gan, style-transfer
CoMoGAN
CoMoGAN: continuous model-guided image-to-image translation. CVPR 2021 oral.
Stars: ✭ 139 (-35.35%)
Mutual labels:  gan, style-transfer
Sketch To Art
🖼 Create artwork from your casual sketch with GAN and style transfer
Stars: ✭ 115 (-46.51%)
Mutual labels:  gan, style-transfer
TET-GAN
[AAAI 2019] TET-GAN: Text Effects Transfer via Stylization and Destylization
Stars: ✭ 74 (-65.58%)
Mutual labels:  gan, style-transfer
Few Shot Patch Based Training
The official implementation of our SIGGRAPH 2020 paper Interactive Video Stylization Using Few-Shot Patch-Based Training
Stars: ✭ 313 (+45.58%)
Mutual labels:  gan, style-transfer
CariMe-pytorch
Unpaired Caricature Generation with Multiple Exaggerations (TMM 2021)
Stars: ✭ 33 (-84.65%)
Mutual labels:  gan, style-transfer
Cyclegan Music Style Transfer
Symbolic Music Genre Transfer with CycleGAN
Stars: ✭ 201 (-6.51%)
Mutual labels:  gan, style-transfer
Neural Painters X
Neural Painters
Stars: ✭ 61 (-71.63%)
Mutual labels:  gan, style-transfer
Zhihu
This repo contains the source code from my personal column (https://zhuanlan.zhihu.com/zhaoyeyu), implemented using Python 3.6. It includes Natural Language Processing and Computer Vision projects, such as text generation, machine translation, deep convolutional GANs, and other hands-on code.
Stars: ✭ 3,307 (+1438.14%)
Mutual labels:  gan, style-transfer
Tsit
[ECCV 2020 Spotlight] A Simple and Versatile Framework for Image-to-Image Translation
Stars: ✭ 141 (-34.42%)
Mutual labels:  gan, style-transfer
Cyclegan Qp
Official PyTorch implementation of "Artist Style Transfer Via Quadratic Potential"
Stars: ✭ 59 (-72.56%)
Mutual labels:  gan, style-transfer
Deep Learning With Python
Example projects I completed to understand Deep Learning techniques with Tensorflow. Please note that I no longer maintain this repository.
Stars: ✭ 134 (-37.67%)
Mutual labels:  gan, style-transfer
Cartoonify
Deploy and scale a serverless machine learning app in 4 steps.
Stars: ✭ 157 (-26.98%)
Mutual labels:  gan, style-transfer
Munit
Multimodal Unsupervised Image-to-Image Translation
Stars: ✭ 2,404 (+1018.14%)
Mutual labels:  gan
Papers
Summaries of machine learning papers
Stars: ✭ 2,362 (+998.6%)
Mutual labels:  gan
Arbitrary Text To Image Papers
A collection of arbitrary text to image papers with code (constantly updating)
Stars: ✭ 196 (-8.84%)
Mutual labels:  gan
Freezed
Freeze the Discriminator: a Simple Baseline for Fine-Tuning GANs (CVPRW 2020)
Stars: ✭ 195 (-9.3%)
Mutual labels:  gan
Style transfer
CNN image style transfer 🎨.
Stars: ✭ 210 (-2.33%)
Mutual labels:  style-transfer

WarpGAN: Automatic Caricature Generation

By Yichun Shi, Debayan Deb and Anil K. Jain

A TensorFlow implementation of WarpGAN, a fully automatic network that generates caricatures from an input face photo. Besides transferring rich texture styles, WarpGAN learns to automatically predict a set of control points that warp the photo into a caricature while preserving identity. We introduce an identity-preserving adversarial loss that helps the discriminator distinguish between different subjects. Moreover, WarpGAN allows customization of the generated caricatures by controlling the exaggeration extent and the visual styles.
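For intuition only, the sketch below illustrates how an exaggeration scale can modulate predicted control-point displacements to drive a warp. Everything here (the function name, the inverse-distance interpolation, the grayscale restriction) is a made-up toy, not WarpGAN's differentiable warping module, which operates on learned control points inside the generator.

    # Toy sketch of control-point-driven warping (illustration only, not WarpGAN's module).
    import numpy as np
    from scipy.ndimage import map_coordinates

    def toy_control_point_warp(image, src_pts, offsets, scale=1.0):
        """Warp a grayscale image so that each control point moves by scale * offset.

        image:   (H, W) float array
        src_pts: (K, 2) control points as (row, col); predicted by the network in WarpGAN
        offsets: (K, 2) displacement per control point
        scale:   exaggeration extent, analogous to test.py's --scale flag
        """
        H, W = image.shape
        dst_pts = src_pts + scale * offsets                    # larger scale -> stronger exaggeration
        rows, cols = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
        grid = np.stack([rows, cols], -1).astype(float)        # (H, W, 2) pixel coordinates
        # Inverse-distance weighting turns the K sparse displacements into a dense
        # backward flow; the real model interpolates inside a differentiable module instead.
        d2 = np.sum((grid[:, :, None] - dst_pts[None, None]) ** 2, -1) + 1e-6
        w = (1.0 / d2) / np.sum(1.0 / d2, -1, keepdims=True)   # (H, W, K) weights
        sample = grid - np.einsum("hwk,kc->hwc", w, scale * offsets)
        return map_coordinates(image, [sample[..., 0], sample[..., 1]], order=1)

Doubling scale doubles every displacement, which is what the --scale 2.0 example in the Testing section below does to exaggerate the caricature more strongly.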

TensorFlow release

Currently, this repo is compatible with TensorFlow r1.9.
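As a quick environment check (the version string comes from the statement above; the assert itself is just a suggestion, not part of the repo):

    # Verify the installed TensorFlow matches the release this repo was tested against.
    import tensorflow as tf

    print(tf.__version__)
    assert tf.__version__.startswith("1.9"), "WarpGAN was tested against TensorFlow r1.9"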

News

Date        Update
2019-04-10  Testing Code
2019-04-07  Training Code
2019-04-05  Initial Code Upload

Citation

@inproceedings{warpgan,
  title = {WarpGAN: Automatic Caricature Generation},
  author = {Shi, Yichun and Deb, Debayan and Jain, Anil K.},
  booktitle = {CVPR},
  year = {2019}
}

Usage

Note: In this section, we assume that you are always in the directory $WARPGAN_ROOT/

Preprocessing

  1. Download the original images of the WebCaricature dataset and unzip them into data/WebCaricature/OriginalImages. Rename the images by running
    python data/rename.py
    
  2. Then, normalize all the faces by running the following code:
    python align/align_dataset.py data/landmarks.txt data/webcaricacture_aligned_256 --scale 0.7
    
    The command will normalize all the photos and caricatures using the landmark points pre-defined in the WebCaricature protocol (we use only 5 landmarks). Note that during deployment, we use MTCNN to detect the face landmarks for images that are not in the dataset. A rough sketch of this kind of 5-point alignment is given after this list.
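For reference, normalizing a face from a handful of landmarks usually amounts to estimating a similarity transform to fixed template positions and resampling. The snippet below is a generic sketch with OpenCV; the template coordinates and function name are placeholders, not taken from align/align_dataset.py.

    # Generic 5-landmark alignment sketch (not the repo's align_dataset.py).
    import cv2
    import numpy as np

    # Hypothetical template positions for (left eye, right eye, nose tip,
    # left mouth corner, right mouth corner) in a 256x256 crop.
    TEMPLATE = np.float32([
        [89, 106], [167, 106], [128, 152], [98, 198], [158, 198],
    ])

    def align_face(image, landmarks, size=256):
        """Estimate a similarity transform from the 5 detected landmarks to the
        template and resample the image into a normalized crop."""
        src = np.float32(landmarks)                          # (5, 2) detected points
        M, _ = cv2.estimateAffinePartial2D(src, TEMPLATE)    # rotation + scale + translation
        return cv2.warpAffine(image, M, (size, size))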

Training

  1. Before training, you need to download the discriminator model, which is pre-trained as an identity classifier, to initialize the discriminator's parameters. Unzip the files under pretrained/discriminator_casia_256/.

  2. The configuration files for training are saved under the config/ folder, where you can define the dataset prefix, training list, model file, and other hyper-parameters. Use the following command to run the default training configuration:

    python train.py config/default.py
    

    The command will create a folder under log/default/ that stores all the checkpoints, test samples, and summaries. The model directory is named after the time you start training. An illustrative sketch of what a config module might contain follows this list.
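The field names in the sketch below are guesses meant to show the kind of attributes a training config typically defines; check config/default.py for the actual names and values.

    # Hypothetical training config sketch (attribute names are assumptions,
    # not copied from config/default.py).

    # Data
    dataset_prefix = 'data/webcaricacture_aligned_256'   # prefix for paths in the training list
    train_list = 'data/train_list.txt'                   # hypothetical list of aligned images
    image_size = 256

    # Initialization
    pretrained_discriminator = 'pretrained/discriminator_casia_256/'

    # Optimization
    batch_size = 2
    learning_rate = 1e-4
    num_epochs = 100

    # Output
    log_base_dir = 'log/default/'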

Testing

  • Run the test code in the following format:
    python test.py /path/to/model/dir /path/to/input/image /prefix/of/output/image
    
  • For example, if you want to use the pre-trained model, download the model and unzip it into pretrained/warpgan_pretrained. Then, run the following command to generate 5 caricatures of Captain Marvel with different random styles:
    python test.py pretrained/warpgan_pretrained \
    data/example/CaptainMarvel.jpg \
    result/CaptainMarvel \
    --num_styles 5
    
  • You can also change the warping extent by using the --scale argument. For example, the following command doubles the displacement of the warping control points (a small driver loop over several scales is sketched after this list):
    python test.py pretrained/warpgan_pretrained \
    data/example/CaptainMarvel.jpg \
    result/CaptainMarvel \
    --num_styles 5 --scale 2.0
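If you want to compare several exaggeration levels side by side, a small driver loop over --scale values works. The paths reuse the example above; the loop and the output prefix are just a convenience, not part of the repo.

    # Generate the example caricatures at several exaggeration levels.
    import subprocess

    for scale in [0.5, 1.0, 1.5, 2.0]:
        subprocess.run([
            "python", "test.py",
            "pretrained/warpgan_pretrained",
            "data/example/CaptainMarvel.jpg",
            "result/CaptainMarvel_scale%.1f" % scale,   # hypothetical output prefix
            "--num_styles", "5",
            "--scale", str(scale),
        ], check=True)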
    

Pre-trained Model

Discriminator Initialization:

Google Drive

WarpGAN:

Google Drive
