texturedesign / texturize

Licence: AGPL-3.0
🤖🖌️ Generate photo-realistic textures based on source images. Remix, remake, mashup! Useful if you want to create variations on a theme or elaborate on an existing texture.

Programming Languages

Python

Projects that are alternatives of or similar to texturize

Gesturegan
[ACM MM 2018 Oral] GestureGAN for Hand Gesture-to-Gesture Translation in the Wild
Stars: ✭ 136 (-72.53%)
Mutual labels:  generative-model, image-manipulation, image-generation
Lggan
[CVPR 2020] Local Class-Specific and Global Image-Level Generative Adversarial Networks for Semantic-Guided Scene Generation
Stars: ✭ 97 (-80.4%)
Mutual labels:  generative-model, image-manipulation, image-generation
Pytorch Cyclegan And Pix2pix
Image-to-Image Translation in PyTorch
Stars: ✭ 16,477 (+3228.69%)
Mutual labels:  image-manipulation, image-generation
Focal Frequency Loss
Focal Frequency Loss for Generative Models
Stars: ✭ 141 (-71.52%)
Mutual labels:  image-manipulation, image-generation
Tsit
[ECCV 2020 Spotlight] A Simple and Versatile Framework for Image-to-Image Translation
Stars: ✭ 141 (-71.52%)
Mutual labels:  image-manipulation, image-generation
Exprgan
Facial Expression Editing with Controllable Expression Intensity
Stars: ✭ 98 (-80.2%)
Mutual labels:  image-manipulation, image-generation
TriangleGAN
TriangleGAN, ACM MM 2019.
Stars: ✭ 28 (-94.34%)
Mutual labels:  generative-model, image-generation
Graphite
Open source 2D node-based raster/vector graphics editor (Photoshop + Illustrator + Houdini = Graphite)
Stars: ✭ 223 (-54.95%)
Mutual labels:  image-manipulation, image-generation
swd
unsupervised video and image generation
Stars: ✭ 50 (-89.9%)
Mutual labels:  generative-model, image-generation
Vae For Image Generation
Implemented Variational Autoencoder generative model in Keras for image generation and its latent space visualization on MNIST and CIFAR10 datasets
Stars: ✭ 87 (-82.42%)
Mutual labels:  generative-model, image-generation
Oneshottranslation
Pytorch implementation of "One-Shot Unsupervised Cross Domain Translation" NIPS 2018
Stars: ✭ 135 (-72.73%)
Mutual labels:  image-manipulation, image-generation
Cyclegan
Software that can generate photos from paintings, turn horses into zebras, perform style transfer, and more.
Stars: ✭ 10,933 (+2108.69%)
Mutual labels:  image-manipulation, image-generation
Distancegan
Pytorch implementation of "One-Sided Unsupervised Domain Mapping" NIPS 2017
Stars: ✭ 180 (-63.64%)
Mutual labels:  image-manipulation, image-generation
Neural Doodle
Turn your two-bit doodles into fine artworks with deep neural networks, generate seamless textures from photos, transfer style from one image to another, perform example-based upscaling, but wait... there's more! (An implementation of Semantic Style Transfer.)
Stars: ✭ 9,680 (+1855.56%)
Mutual labels:  image-manipulation, image-generation
Finegan
FineGAN: Unsupervised Hierarchical Disentanglement for Fine-grained Object Generation and Discovery
Stars: ✭ 240 (-51.52%)
Mutual labels:  image-manipulation, image-generation
Contrastive Unpaired Translation
Contrastive unpaired image-to-image translation, faster and lighter training than cyclegan (ECCV 2020, in PyTorch)
Stars: ✭ 822 (+66.06%)
Mutual labels:  image-manipulation, image-generation
Pix2pix
Image-to-image translation with conditional adversarial nets
Stars: ✭ 8,765 (+1670.71%)
Mutual labels:  image-manipulation, image-generation
Generating Devanagari Using Draw
PyTorch implementation of DRAW: A Recurrent Neural Network For Image Generation trained on Devanagari dataset.
Stars: ✭ 82 (-83.43%)
Mutual labels:  generative-model, image-generation
procjam2018
Graph.ical, a procedural texture authoring application developed for PROCJAM 2018.
Stars: ✭ 42 (-91.52%)
Mutual labels:  texture-synthesis, texture-generation

texturize

docs/gravel-x4.webp

A command-line tool and Python library to automatically generate new textures similar to a source image or photograph. It's useful in the context of computer graphics if you want to make variations on a theme or expand the size of an existing texture.

This software is powered by deep learning, using a combination of convolutional networks and example-based optimization to synthesize images. We're building texturize as the highest-quality open source library available!

  1. Examples & Demos
  2. Commands
  3. Options & Usage
  4. Installation



1. Examples & Demos

The examples are available as notebooks, and you can run them directly in-browser thanks to Jupyter and Google Colab.

These demo materials are released under the Creative Commons BY-NC-SA license, including the text, images and code.

docs/grass-x4.webp

2. Commands

a) REMIX

Generate variations of any shape from a single texture.

Remix Command-Line

Usage:
    texturize remix SOURCE...

Examples:
    texturize remix samples/grass.webp --size=720x360
    texturize remix samples/gravel.png --size=512x512

Remix Library API

from texturize import api, commands, io

# The input could be any PIL Image in RGB mode.
image = io.load_image_from_file("input.png")

# Coarse-to-fine synthesis runs one octave at a time.
remix = commands.Remix(image)
for result in api.process_octaves(remix, octaves=5):
    pass

# The output can be saved in any PIL-supported format.
result.image.save("output.png")
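
If you want to inspect the coarse-to-fine progression rather than keep only the final image, the same loop can save each intermediate result. The following is a minimal sketch reusing only the API shown above; it assumes process_octaves yields one result per octave, as the comment in the example suggests:

# Sketch only: save every intermediate result for inspection,
# assuming one result is yielded per octave (not verified here).
for octave, result in enumerate(api.process_octaves(remix, octaves=5)):
    result.image.save("remix_octave{}.png".format(octave))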

Remix Examples

docs/remix-gravel.webp


b) REMAKE

Reproduce an original texture in the style of another.

Remake Command-Line

Usage:
    texturize remake TARGET [like] SOURCE

Examples:
    texturize remake samples/grass1.webp like samples/grass2.webp
    texturize remake samples/gravel1.png like samples/gravel2.png --weight 0.5

Remake Library API

from texturize import api, commands, io

# The input could be any PIL Image in RGB mode.
target = io.load_image_from_file("input1.png")
source = io.load_image_from_file("input2.png")

# Only process one octave to retain photo-realistic output.
remake = commands.Remake(target, source)
for result in api.process_octaves(remake, octaves=1):
    pass

# The output can be saved in any PIL-supported format.
result.image.save("output.png")

Remake Examples

docs/remake-grass.webp


c) MASHUP

Combine multiple textures together into one output.

Mashup Command-Line

Usage:
    texturize mashup SOURCE...

Examples:
    texturize mashup samples/grass1.webp samples/grass2.webp
    texturize mashup samples/gravel1.png samples/gravel2.png

Mashup Library API

from texturize import api, commands, io

# The input could be any PIL Image in RGB mode.
sources = [
    io.load_image_from_file("input1.png"),
    io.load_image_from_file("input2.png"),
]

# Coarse-to-fine synthesis runs one octave at a time.
mashup = commands.Mashup(sources)
for result in api.process_octaves(mashup, octaves=5):
    pass

# The output can be saved in any PIL-supported format.
result.image.save("output.png")

Mashup Examples

docs/mashup-gravel.webp


d) ENHANCE

Increase the resolution or quality of a texture using another as an example.

Enhance Command-Line

Usage:
    texturize enhance TARGET [with] SOURCE --zoom=ZOOM

Examples:
    texturize enhance samples/grass1.webp with samples/grass2.webp --zoom=2
    texturize enhance samples/gravel1.png with samples/gravel2.png --zoom=4

Enhance Library API

from texturize import api, commands, io

# The input could be any PIL Image in RGB mode.
target = io.load_image_from_file("input1.png")
source = io.load_image_from_file("input2.png")

# Coarse-to-fine synthesis runs one octave at a time.
enhance = commands.Enhance(target, source, zoom=2)
for result in api.process_octaves(enhance, octaves=2):
    pass

# The output can be saved in any PIL-supported format.
result.image.save("output.png")

Enhance Examples

docs/enhance-grass.webp


3. Options & Usage

For details about the command-line usage of the tool, see the tool itself:

texturize --help

Here are the command-line options currently available, which apply to most of the commands above; a combined example is shown after the list:

Options:
    SOURCE                  Path to source image to use as texture.
    -s WxH, --size=WxH      Output resolution as WIDTHxHEIGHT. [default: 640x480]
    -o FILE, --output=FILE  Filename for saving the result, includes format variables.
                            [default: {command}_{source}{variation}.png]

    --weights=WEIGHTS       Comma-separated list of blend weights. [default: 1.0]
    --zoom=ZOOM             Integer zoom factor for enhancing. [default: 2]

    --variations=V          Number of images to generate at the same time. [default: 1]
    --seed=SEED             Configure the random number generation.
    --mode=MODE             Either "patch" or "gram" to manually specify critics.
    --octaves=O             Number of octaves to process. Defaults to 5 for 512x512, or
                            4 for 256x256 equivalent pixel count.
    --quality=Q             Quality for optimization, higher is better. [default: 5]
    --device=DEVICE         Hardware to use, either "cpu" or "cuda".
    --precision=PRECISION   Floating-point format to use, "float16" or "float32".
    --quiet                 Suppress any messages going to stdout.
    --verbose               Display more information on stdout.
    -h, --help              Show this message.
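
For example, several of these options can be combined in a single invocation. The command below is only an illustration assembled from the flags listed above, using a sample image from the earlier examples; it assumes a CUDA-capable GPU is available (drop --device=cuda to run on the CPU):

    texturize remix samples/grass.webp --size=1024x512 --variations=2 \
        --quality=5 --device=cuda --output="{command}_{source}{variation}.png"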

4. Installation

Latest Release [recommended]

We suggest using Miniconda 3.x to manage your Python environments. Once the conda command-line tool is installed on your machine, there are setup scripts you can download directly from the repository:

# a) Use this if you have an *Nvidia GPU only*.
curl -s https://raw.githubusercontent.com/photogeniq/texturize/master/tasks/setup-cuda.yml -o setup.yml

# b) Fallback if you just want to run on CPU.
curl -s https://raw.githubusercontent.com/photogeniq/texturize/master/tasks/setup-cpu.yml -o setup.yml

Now you can create a fresh Conda environment for texture synthesis:

conda env create -n myenv -f setup.yml
conda activate myenv

NOTE: Any version of CUDA is suitable to run texturize as long as PyTorch is working. See the official PyTorch installation guide for alternative ways to install the PyTorch library.
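
If you want to confirm that PyTorch can see your GPU inside the activated environment, a quick check (independent of texturize, using PyTorch's standard torch.cuda.is_available call) is:

    python -c "import torch; print(torch.cuda.is_available())"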

Then, you can fetch the latest version of the library from the Python Package Index (PyPI) using the following command:

pip install texturize

Finally, you can check if everything worked by calling the command-line script:

texturize --help

You can use conda env remove -n myenv to delete the virtual environment once you are done.

Repository Install [developers]

If you're a developer and want to install the library locally, start by cloning the repository to your local disk:

git clone https://github.com/photogeniq/texturize.git

We also recommend using Miniconda 3.x for development. You can set up a new virtual environment called myenv by running the following commands, depending on whether you want to run on CPU or GPU (via CUDA). For advanced setups like specifying which CUDA version to use, see the official PyTorch installation guide.

cd texturize

# a) Use this if you have an *Nvidia GPU only*.
conda env create -n myenv -f tasks/setup-cuda.yml

# b) Fallback if you just want to run on CPU.
conda env create -n myenv -f tasks/setup-cpu.yml

Once the virtual environment is created, you can activate it and finish the setup of texturize with these commands:

conda activate myenv
poetry install

Finally, you can check if everything worked by calling the script:

texturize --help

Use conda env remove -n myenv to remove the virtual environment once you are done.

