
pietrocarbo / deep-transfer

License: Apache-2.0
PyTorch implementation of "Universal Style Transfer via Feature Transforms"

Programming Languages

python

Projects that are alternatives of or similar to deep-transfer

wc-gan
Whitening and Coloring transform for GANs
Stars: ✭ 33 (-54.17%)
Mutual labels:  whitening
pytorch-neural-style-transfer-johnson
Reconstruction of fast neural style transfer (Johnson et al.). Some portions of the paper have been improved by follow-up work, such as instance normalization. Check out transformer_net.py's header for details.
Stars: ✭ 85 (+18.06%)
Mutual labels:  neural-style-transfer
Neural-Zoom
Infinite Zoom For Style Transfer
Stars: ✭ 34 (-52.78%)
Mutual labels:  neural-style-transfer
Shakespearizing-Modern-English
Code for "Jhamtani H.*, Gangal V.*, Hovy E. and Nyberg E. Shakespearizing Modern Language Using Copy-Enriched Sequence to Sequence Models" Workshop on Stylistic Variation, EMNLP 2017
Stars: ✭ 64 (-11.11%)
Mutual labels:  neural-style-transfer
Keras-Style-Transfer
An implementation of "A Neural Algorithm of Artistic Style" in Keras
Stars: ✭ 36 (-50%)
Mutual labels:  neural-style-transfer
neural-flow-style
Neural Style Transfer for Fluids
Stars: ✭ 101 (+40.28%)
Mutual labels:  neural-style-transfer
zca
ZCA whitening in python
Stars: ✭ 29 (-59.72%)
Mutual labels:  zca
color-aware-style-transfer
Reference code for the paper CAMS: Color-Aware Multi-Style Transfer.
Stars: ✭ 36 (-50%)
Mutual labels:  neural-style-transfer
neural-style-transfer
Artistic style transfer
Stars: ✭ 19 (-73.61%)
Mutual labels:  neural-style-transfer
semi-supervised-paper-implementation
Reproduce some methods in semi-supervised papers.
Stars: ✭ 35 (-51.39%)
Mutual labels:  zca
Domain-Aware-Style-Transfer
Official Implementation of Domain-Aware Universal Style Transfer
Stars: ✭ 84 (+16.67%)
Mutual labels:  neural-style-transfer
PyTorch-deep-photo-styletransfer
PyTorch implementation of "Deep Photo Style Transfer": https://arxiv.org/abs/1703.07511
Stars: ✭ 23 (-68.06%)
Mutual labels:  neural-style-transfer

deep-transfer

This is a PyTorch implementation of the NIPS 2017 paper "Universal Style Transfer via Feature Transforms".

Given a content image and an arbitrary style image, the program transfers the visual style characteristics extracted from the style image onto the content image, generating a stylized output.

The core architecture is a VGG-19 convolutional autoencoder that applies a whitening and coloring transform (WCT) to the content and style features at the bottleneck layer.
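To illustrate the idea behind the WCT step, here is a minimal NumPy sketch (not the repo's actual code, which operates on PyTorch tensors inside the autoencoder; the function name, shapes, and epsilon regularizer are illustrative). The content features are whitened so their covariance becomes the identity, then "colored" so they take on the covariance of the style features, and finally blended with the original content features via alpha:

```python
import numpy as np

def wct(content_feat, style_feat, alpha=0.2, eps=1e-5):
    """Whitening-coloring transform on (C, H, W) feature maps."""
    C, H, W = content_feat.shape
    cf = content_feat.reshape(C, -1).astype(np.float64)
    sf = style_feat.reshape(C, -1).astype(np.float64)

    # Center both feature sets.
    c_mean = cf.mean(axis=1, keepdims=True)
    s_mean = sf.mean(axis=1, keepdims=True)
    cf = cf - c_mean
    sf = sf - s_mean

    # Whitening: decorrelate the content features (covariance -> identity).
    c_cov = cf @ cf.T / (cf.shape[1] - 1) + eps * np.eye(C)
    wc, vc = np.linalg.eigh(c_cov)
    whitened = vc @ np.diag(wc ** -0.5) @ vc.T @ cf

    # Coloring: impose the style covariance (and mean) on the whitened features.
    s_cov = sf @ sf.T / (sf.shape[1] - 1) + eps * np.eye(C)
    ws, vs = np.linalg.eigh(s_cov)
    colored = vs @ np.diag(ws ** 0.5) @ vs.T @ whitened + s_mean

    # Blend with the original content features (alpha = 1 is full stylization).
    blended = alpha * colored + (1 - alpha) * content_feat.reshape(C, -1)
    return blended.reshape(C, H, W)
```

With alpha = 1, the second-order statistics of the output match those of the style features, which is what lets an arbitrary style be applied without any per-style training.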

Installation

  • The required Python packages can be installed with the conda package manager by running conda env create -f environment.yaml

Functionalities

Available modalities are:

  • style transfer (inputs: a content image and a style image);

  • texture synthesis (inputs: a texture style image);

  • style transfer interpolation (inputs: a content image and 2 style images);

  • texture synthesis interpolation (inputs: 2 texture style images);

  • spatial control over stylization (inputs: a content image, a binary mask of the same size and 2 style images for background-foreground stylization).

Usage

python main.py ARGS

Possible ARGS are:

  • -h, --help show this help message and exit;
  • --content CONTENT path of the content image (or a directory containing images) to be transformed;
  • --style STYLE path of the style image (or a directory containing images) to use;
  • --synthesis flag to synthesize a new texture. A texture style image must also be provided;
  • --stylePair STYLEPAIR paths of two style images (separated by ",") to combine together;
  • --mask MASK path of the binary mask image (white on black) used to transfer the style pair to the corresponding areas;
  • --contentSize CONTENTSIZE reshape the content image to the new specified maximum size (keeping aspect ratio);
  • --styleSize STYLESIZE reshape the style image to the new specified maximum size (keeping aspect ratio);
  • --outDir OUTDIR path of the directory where stylized results will be saved (default is outputs/);
  • --outPrefix OUTPREFIX prefix for the names of the saved stylized images;
  • --alpha ALPHA hyperparameter balancing the blend between the original content features and the WCT-transformed features (default is 0.2);
  • --beta BETA hyperparameter balancing the interpolation between the two images in the stylePair (default is 0.5);
  • --no-cuda flag to force CPU-only computation (default is False, i.e. GPU (CUDA) acceleration is used);
  • --single-level flag to use single-level stylization (default is False).

Supported image file formats are: jpg, jpeg, png.
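For reference, the modalities above map to invocations like the following (all file paths are illustrative, not files shipped with the repo):

```shell
# Style transfer: one content image, one style image
python main.py --content inputs/portrait.jpg --style inputs/starry_night.jpg --alpha 0.4

# Texture synthesis: a single texture style image
python main.py --synthesis --style inputs/bricks.jpg

# Style transfer interpolation: blend two styles, weighted by beta
python main.py --content inputs/portrait.jpg --stylePair inputs/starry_night.jpg,inputs/scream.jpg --beta 0.5

# Spatial control: a binary mask assigns each style to its region
python main.py --content inputs/portrait.jpg --mask inputs/mask.png --stylePair inputs/sky.jpg,inputs/fire.jpg
```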
