
EndyWon / GLStyleNet

License: MIT
Semantic style transfer: code and data for "GLStyleNet: Exquisite Style Transfer Combining Global and Local Pyramid Features" (IET Computer Vision 2020)

Programming Languages

  • Python
  • Shell

Projects that are alternatives of or similar to GLStyleNet

mlmodelzoo
Build your iOS 11+ apps with the ready-to-use Core ML models below
Stars: ✭ 17 (-64.58%)
Mutual labels:  style-transfer
style-transfer-video-processor
This code extends the neural style transfer image processing technique to video by generating smooth transitions between several reference style images
Stars: ✭ 113 (+135.42%)
Mutual labels:  style-transfer
gans-2.0
Generative Adversarial Networks in TensorFlow 2.0
Stars: ✭ 76 (+58.33%)
Mutual labels:  style-transfer
MeuralPaint
TensorFlow implementation of CNN fast neural style transfer ⚡️ 🎨 🌌
Stars: ✭ 19 (-60.42%)
Mutual labels:  style-transfer
Domain-Aware-Style-Transfer
Official Implementation of Domain-Aware Universal Style Transfer
Stars: ✭ 84 (+75%)
Mutual labels:  style-transfer
style swap tensorflow
tensorflow code for Fast Patch-based Style Transfer of Arbitrary Style
Stars: ✭ 42 (-12.5%)
Mutual labels:  style-transfer
MIDI-VAE
No description or website provided.
Stars: ✭ 56 (+16.67%)
Mutual labels:  style-transfer
TitleStylist
Source code for our "TitleStylist" paper at ACL 2020
Stars: ✭ 72 (+50%)
Mutual labels:  style-transfer
Face-Sketch
Face Sketch Synthesis with Style Transfer using Pyramid Column Feature, WACV2018
Stars: ✭ 52 (+8.33%)
Mutual labels:  style-transfer
ImgFastNeuralStyleTransfer TensorFlow
A hands-on learning exercise in fast style transfer
Stars: ✭ 22 (-54.17%)
Mutual labels:  style-transfer
transformer-drg-style-transfer
This repository has scripts and Jupyter notebooks to perform all the steps involved in the "Transforming Delete, Retrieve, Generate" approach for controlled text style transfer
Stars: ✭ 97 (+102.08%)
Mutual labels:  style-transfer
Joint-Bilateral-Learning
An unofficial implementation of Joint Bilateral Learning for Real-time Universal photorealistic Style Transfer
Stars: ✭ 52 (+8.33%)
Mutual labels:  style-transfer
StyleGAN demo
The re-implementation of style-based generator idea
Stars: ✭ 22 (-54.17%)
Mutual labels:  style-transfer
Neural-Tile
A better tiling script for Neural-Style
Stars: ✭ 35 (-27.08%)
Mutual labels:  style-transfer
Artistic-Style-Transfer-using-Keras-Tensorflow
Art to Image Style Transfer using Keras and Tensorflow.
Stars: ✭ 22 (-54.17%)
Mutual labels:  style-transfer
ArtsyNetworks
Deep Learning + Arts
Stars: ✭ 24 (-50%)
Mutual labels:  style-transfer
StyleCLIPDraw
Styled text-to-drawing synthesis method. Featured at IJCAI 2022 and the 2021 NeurIPS Workshop on Machine Learning for Creativity and Design
Stars: ✭ 247 (+414.58%)
Mutual labels:  style-transfer
Android-Tensorflow-Style-Transfer
An Android app built with an artistic style transfer neural network
Stars: ✭ 31 (-35.42%)
Mutual labels:  style-transfer
color-aware-style-transfer
Reference code for the paper CAMS: Color-Aware Multi-Style Transfer.
Stars: ✭ 36 (-25%)
Mutual labels:  style-transfer
StyleTransfer-PyTorch
Implementation of image style transfer in PyTorch
Stars: ✭ 18 (-62.5%)
Mutual labels:  style-transfer

GLStyleNet

[update 1/12/2022]

Paper: GLStyleNet: Exquisite Style Transfer Combining Global and Local Pyramid Features, published in IET Computer Vision 2020.

arXiv version: GLStyleNet: Higher Quality Style Transfer Combining Global and Local Pyramid Features.

Required Environment:

  • Python 3.6
  • TensorFlow 1.4.0
  • CUDA 8.0
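
These are legacy versions, so mismatches are a common source of errors. A minimal sanity check might look like the following (an illustrative sketch, not part of the repository; it only uses the standard library plus an optional TensorFlow import):

```python
# Minimal environment sanity check for the legacy versions listed above.
# Illustrative sketch only; not part of the GLStyleNet repository.
import sys

def check_env():
    """Return a list of warnings about version mismatches."""
    warnings = []
    if sys.version_info[:2] != (3, 6):
        warnings.append("Python 3.6 expected, found %d.%d" % sys.version_info[:2])
    try:
        import tensorflow as tf
        if not tf.__version__.startswith("1.4"):
            warnings.append("TensorFlow 1.4.x expected, found " + tf.__version__)
    except ImportError:
        warnings.append("TensorFlow is not installed")
    return warnings

for w in check_env():
    print("warning:", w)
```

Running this before Step 3 surfaces version problems early instead of as an opaque TensorFlow error mid-run.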

Getting Started:

Step 1: clone this repo

git clone https://github.com/EndyWon/GLStyleNet
cd GLStyleNet

Step 2: download the pre-trained VGG19 model

bash download_vgg19.sh

Step 3: run style transfer

  1. Script Parameters
  • --content : content image path
  • --content-mask : content image semantic mask
  • --style : style image path
  • --style-mask : style image semantic mask
  • --content-weight : weight of content, default=10
  • --local-weight : weight of local style loss
  • --semantic-weight : weight of semantic map constraint
  • --global-weight : weight of global style loss
  • --output : output image path
  • --smoothness : weight of image smoothing scheme
  • --init : image type used for initialization, one of 'noise', 'content', or 'style'; default='content'
  • --iterations : number of iterations, default=500
  • --device : compute device, 'gpu' (all available GPUs), 'gpu<i>' for a specific GPU (e.g. 'gpu0'), or 'cpu'; default='gpu'
  • --class-num : number of semantic mask classes, default=5
  2. Portrait style transfer (an example)

python GLStyleNet.py --content portrait/Seth.jpg --content-mask portrait/Seth_sem.png --style portrait/Gogh.jpg --style-mask portrait/Gogh_sem.png --content-weight 10 --local-weight 500 --semantic-weight 10 --global-weight 1 --init style --device gpu

All iteration results are saved in the 'outputs' folder.

(figure: portrait style transfer results)
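
Invocations this long are easier to manage from a small wrapper. The sketch below is a hypothetical helper (`build_cmd` is not part of the repo) that assembles the same command line from keyword arguments:

```python
# Hypothetical helper for composing GLStyleNet.py command lines.
# Not part of the repository; shown only to illustrate the flag layout.
import shlex

def build_cmd(content, style, content_mask=None, style_mask=None, **options):
    parts = ["python", "GLStyleNet.py", "--content", content, "--style", style]
    if content_mask:
        parts += ["--content-mask", content_mask]
    if style_mask:
        parts += ["--style-mask", style_mask]
    # Map Python-style keyword names to the script's dashed flags.
    for flag, value in sorted(options.items()):
        parts += ["--" + flag.replace("_", "-"), str(value)]
    return " ".join(shlex.quote(p) for p in parts)

print(build_cmd("portrait/Seth.jpg", "portrait/Gogh.jpg",
                content_mask="portrait/Seth_sem.png",
                style_mask="portrait/Gogh_sem.png",
                content_weight=10, local_weight=500,
                semantic_weight=10, global_weight=1,
                init="style", device="gpu"))
```

`shlex.quote` keeps paths with spaces safe if the helper's output is pasted into a shell.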

  3. Chinese ancient painting style transfer (an example)

python GLStyleNet.py --content Chinese/content.jpg --content-mask Chinese/content_sem.png --style Chinese/style.jpg --style-mask Chinese/style_sem.png --content-weight 10 --local-weight 500 --semantic-weight 2.5 --global-weight 0.5 --init content --device gpu

(figure: Chinese ancient painting style transfer results)
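
The semantic-map weight is the main knob in this example. A hypothetical sweep (not part of the repo) that emits one command line per candidate value, with distinct `--output` paths, might look like:

```python
# Hypothetical parameter sweep: emit one command line per semantic-weight value.
# Paths and flags mirror the Chinese painting example above; output names are made up.
base = ("python GLStyleNet.py --content Chinese/content.jpg "
        "--content-mask Chinese/content_sem.png --style Chinese/style.jpg "
        "--style-mask Chinese/style_sem.png --content-weight 10 "
        "--local-weight 500 --global-weight 0.5 --init content --device gpu")

weights = (1, 2.5, 5, 10)
commands = ["%s --semantic-weight %s --output outputs/chinese_sw%s.png" % (base, w, w)
            for w in weights]
for cmd in commands:
    print(cmd)
```

Piping the printed commands through a shell (or `subprocess.run`) turns this into a batch run; lower weights relax the semantic-map constraint, higher ones enforce it more strictly.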

  4. Artistic and photo-realistic style transfer

artistic:

(figure: artistic style transfer results)

photo-realistic:

(figure: photo-realistic style transfer results)

Citation:

If you find this code useful for your research, please cite the paper:

@article{wang2020glstylenet,
  title={GLStyleNet: exquisite style transfer combining global and local pyramid features},
  author={Wang, Zhizhong and Zhao, Lei and Lin, Sihuan and Mo, Qihang and Zhang, Huiming and Xing, Wei and Lu, Dongming},
  journal={IET Computer Vision},
  volume={14},
  number={8},
  pages={575--586},
  year={2020},
  publisher={IET}
}

Acknowledgement:

This code builds on Champandard's implementation.
