
afruehstueck / Tilegan

License: GPL-3.0
Code for TileGAN: Synthesis of Large-Scale Non-Homogeneous Textures (SIGGRAPH 2019)

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives of or similar to Tilegan

Vae For Image Generation
Implemented Variational Autoencoder generative model in Keras for image generation and its latent space visualization on MNIST and CIFAR10 datasets
Stars: ✭ 87 (-47.59%)
Mutual labels:  image-generation
Mlds2018spring
Machine Learning and having it Deep and Structured (MLDS) in 2018 spring
Stars: ✭ 124 (-25.3%)
Mutual labels:  image-generation
Focal Frequency Loss
Focal Frequency Loss for Generative Models
Stars: ✭ 141 (-15.06%)
Mutual labels:  image-generation
Lggan
[CVPR 2020] Local Class-Specific and Global Image-Level Generative Adversarial Networks for Semantic-Guided Scene Generation
Stars: ✭ 97 (-41.57%)
Mutual labels:  image-generation
Pytorch Generative
Easy generative modeling in PyTorch.
Stars: ✭ 112 (-32.53%)
Mutual labels:  image-generation
Oneshottranslation
Pytorch implementation of "One-Shot Unsupervised Cross Domain Translation" NIPS 2018
Stars: ✭ 135 (-18.67%)
Mutual labels:  image-generation
Generating Devanagari Using Draw
PyTorch implementation of DRAW: A Recurrent Neural Network For Image Generation trained on Devanagari dataset.
Stars: ✭ 82 (-50.6%)
Mutual labels:  image-generation
Mmediting
OpenMMLab Image and Video Editing Toolbox
Stars: ✭ 2,618 (+1477.11%)
Mutual labels:  image-generation
Icface
ICface: Interpretable and Controllable Face Reenactment Using GANs
Stars: ✭ 122 (-26.51%)
Mutual labels:  image-generation
Unetgan
Official Implementation of the paper "A U-Net Based Discriminator for Generative Adversarial Networks" (CVPR 2020)
Stars: ✭ 139 (-16.27%)
Mutual labels:  image-generation
Exprgan
Facial Expression Editing with Controllable Expression Intensity
Stars: ✭ 98 (-40.96%)
Mutual labels:  image-generation
Pwa Asset Generator
Automates PWA asset generation and image declaration. Automatically generates icon and splash screen images, favicons and mstile images. Updates manifest.json and index.html files with the generated images according to Web App Manifest specs and Apple Human Interface guidelines.
Stars: ✭ 1,787 (+976.51%)
Mutual labels:  image-generation
Gesturegan
[ACM MM 2018 Oral] GestureGAN for Hand Gesture-to-Gesture Translation in the Wild
Stars: ✭ 136 (-18.07%)
Mutual labels:  image-generation
Vue Pwa Asset Generator
PWA asset generator perfect with VueJS framework (but useful for all PWA!)
Stars: ✭ 97 (-41.57%)
Mutual labels:  image-generation
Tsit
[ECCV 2020 Spotlight] A Simple and Versatile Framework for Image-to-Image Translation
Stars: ✭ 141 (-15.06%)
Mutual labels:  image-generation
Ganspace
Discovering Interpretable GAN Controls [NeurIPS 2020]
Stars: ✭ 1,224 (+637.35%)
Mutual labels:  image-generation
Cyclegan
Software that can generate photos from paintings, turn horses into zebras, perform style transfer, and more.
Stars: ✭ 10,933 (+6486.14%)
Mutual labels:  image-generation
Scene generation
A PyTorch implementation of the paper: Specifying Object Attributes and Relations in Interactive Scene Generation
Stars: ✭ 158 (-4.82%)
Mutual labels:  image-generation
Vae Lagging Encoder
PyTorch implementation of "Lagging Inference Networks and Posterior Collapse in Variational Autoencoders" (ICLR 2019)
Stars: ✭ 153 (-7.83%)
Mutual labels:  image-generation
Fq Gan
Official implementation of FQ-GAN
Stars: ✭ 137 (-17.47%)
Mutual labels:  image-generation

TileGAN: Synthesis of Large-Scale Non-Homogeneous Textures

We tackle the problem of texture synthesis in the setting where many input images are given and a large-scale output is required. We build on recent generative adversarial networks and propose two extensions in this paper. First, we propose an algorithm to combine outputs of GANs trained at a smaller resolution to produce a large-scale, plausible texture map with virtually no boundary artifacts. Second, we propose a user interface to enable artistic control. Our quantitative and qualitative results showcase the synthesis of high-resolution texture maps of up to hundreds of megapixels.
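
To make the boundary-artifact problem concrete, the sketch below shows the naive baseline: generating tiles independently and pasting them side by side. The generate_tile function is only a placeholder standing in for a trained generator; with a real GAN, neighbouring tiles sampled this way would not match at their borders, which is exactly what TileGAN's merging of GAN outputs is designed to avoid.

    import numpy as np

    def generate_tile(rng, size=256):
        # Placeholder standing in for a trained GAN generator (random values here).
        return rng.random((size, size, 3))

    def naive_tiling(rows, cols, size=256, seed=0):
        # Paste independently generated tiles into one large canvas. Because
        # adjacent tiles share no information, a real generator would produce
        # visible seams at every tile border.
        rng = np.random.default_rng(seed)
        canvas = np.zeros((rows * size, cols * size, 3))
        for r in range(rows):
            for c in range(cols):
                canvas[r * size:(r + 1) * size, c * size:(c + 1) * size] = generate_tile(rng, size)
        return canvas

    mosaic = naive_tiling(rows=4, cols=4)  # 1024 x 1024 pixels assembled from 16 independent tiles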

Video

Watch our video on YouTube:

High Resolution Results

Some of our results can be viewed interactively on EasyZoom:

Code

The TileGAN application consists of two independent processes, the server and the client. Both can run locally on your machine, or you can run the server on a remote machine, depending on your hardware setup. All network operations are performed by the server process, which sends the result to the client for display.
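
The sketch below illustrates this split with a plain TCP request/response loop; it is only a conceptual illustration, not the actual TileGAN protocol or API. The message framing, the port, and the run_network callback are all assumptions made for the example.

    import pickle
    import socket
    import struct

    PORT = 8000  # illustrative port, not the one TileGAN uses

    def send_msg(sock, obj):
        # Length-prefixed pickle framing so messages survive TCP chunking.
        payload = pickle.dumps(obj)
        sock.sendall(struct.pack(">I", len(payload)) + payload)

    def recv_exact(sock, n):
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("socket closed")
            buf += chunk
        return buf

    def recv_msg(sock):
        length = struct.unpack(">I", recv_exact(sock, 4))[0]
        return pickle.loads(recv_exact(sock, length))

    def serve(run_network, host="0.0.0.0"):
        # Server side: owns the GPU and the trained networks, and answers one
        # request (a dict of parameters) with one result (an image array).
        with socket.socket() as srv:
            srv.bind((host, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                while True:
                    request = recv_msg(conn)
                    send_msg(conn, run_network(request))

    def request_texture(server_ip, params):
        # Client side: sends parameters, receives the rendered result for display.
        with socket.create_connection((server_ip, PORT)) as sock:
            send_msg(sock, params)
            return recv_msg(sock)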

Download our pre-trained networks

  • Download the network(s) to the location of your server (this can be your local machine or a remote server)
  • Extract the .zip to ./data. There should be a separate folder for each dataset in ./data (e.g. ./data/vangogh) containing a *_network.pkl, a *_descriptors.hdf5, a *_clusters.hdf5 and a *_kmeans.joblib file.
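
To confirm the extraction, a minimal sanity check such as the sketch below (assuming the ./data layout described above) lists which of the four expected files are present in each dataset folder:

    import glob
    import os

    EXPECTED = ["*_network.pkl", "*_descriptors.hdf5", "*_clusters.hdf5", "*_kmeans.joblib"]

    def check_dataset(folder):
        # Report which of the expected files are missing from one dataset folder.
        missing = [p for p in EXPECTED if not glob.glob(os.path.join(folder, p))]
        status = "OK" if not missing else "missing " + ", ".join(missing)
        print(f"{folder}: {status}")

    for entry in sorted(os.listdir("data")):
        path = os.path.join("data", entry)  # e.g. data/vangogh
        if os.path.isdir(path):
            check_dataset(path)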

Setup server

  • Install requirements from requirements-pip.txt
  • Install hnswlib (a quick check that the installation works is sketched after this list):
    git clone https://github.com/nmslib/hnswlib.git
    cd hnswlib/python_bindings
    python setup.py install
    
  • Run python tileGAN_server.py
  • The server process will start and then tell you which IP address to connect to.
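
To verify that the hnswlib build from the list above imports and works, a quick smoke test along these lines (with arbitrary dimensions and sizes) builds a small index and runs an approximate nearest-neighbour query:

    import hnswlib
    import numpy as np

    dim, num = 64, 1000
    data = np.random.rand(num, dim).astype(np.float32)

    index = hnswlib.Index(space="l2", dim=dim)
    index.init_index(max_elements=num, ef_construction=200, M=16)
    index.add_items(data, np.arange(num))

    labels, distances = index.knn_query(data[:5], k=3)
    print(labels.shape)  # (5, 3): three approximate neighbours for each of five queries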

Setup client

  • Install Qt for Python. The easiest way to do this is using conda: conda install -c conda-forge qt pyside2
  • Install requirements from requirements-pip.txt
  • Run python tileGAN_client.py XX.XX.XX.XX (insert the IP from the server). If you're running the server on the same machine as the application, you can omit the IP address or use 'localhost'.
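
If you are unsure how the optional address argument behaves, the snippet below is a hypothetical illustration of that convention (defaulting to localhost when no IP is given); the actual argument handling in tileGAN_client.py may differ.

    import argparse

    parser = argparse.ArgumentParser(description="TileGAN client (illustration only)")
    parser.add_argument("host", nargs="?", default="localhost",
                        help="IP address of the machine running tileGAN_server.py")
    args = parser.parse_args()
    print(f"connecting to {args.host} ...")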

Using your own data/network (optional)

  • Train a network on your own data using Progressive Growing of GANs
  • Run create_dataset path_to_pkl num_latents t_size num_clusters network_name (expected to take between 10 and 60 minutes, depending on the specified sample size):
    • path_to_pkl: the path to the trained network pickle of your ProGAN network
    • num_latents: the number of database entries (a good range is 50K to 300K)
    • t_size: the size of the output descriptors (an even number roughly between 12 and 24)
    • num_clusters: the number of clusters (approx. 8-16)
    • network_name: the name you want to assign to your network
  • Run the server and the client and load your network in the UI from the drop-down menu. The first time the network is loaded, an ANN index is created (expected to take less than 5 minutes, depending on the sample size).
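
As a rough illustration of the kind of data these parameters describe (not the repository's actual create_dataset implementation), the sketch below clusters a set of placeholder descriptors and writes them in the file formats listed in the data layout section; the HDF5 dataset keys and the descriptor values are assumptions made for the example.

    import os

    import h5py
    import joblib
    import numpy as np
    from sklearn.cluster import MiniBatchKMeans

    num_latents, descriptor_dim, num_clusters = 50_000, 16, 8  # example values
    name = "mytextures"                                        # hypothetical network_name
    os.makedirs(f"data/{name}", exist_ok=True)

    # Placeholder descriptors; the real pipeline derives them from ProGAN outputs.
    descriptors = np.random.rand(num_latents, descriptor_dim).astype(np.float32)

    kmeans = MiniBatchKMeans(n_clusters=num_clusters, random_state=0)
    cluster_ids = kmeans.fit_predict(descriptors)

    with h5py.File(f"data/{name}/{name}_descriptors.hdf5", "w") as f:
        f.create_dataset("descriptors", data=descriptors)  # assumed key name
    with h5py.File(f"data/{name}/{name}_clusters.hdf5", "w") as f:
        f.create_dataset("clusters", data=cluster_ids)     # assumed key name
    joblib.dump(kmeans, f"data/{name}/{name}_kmeans.joblib")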

Using our application

(Images: TileGAN UI and TileGAN tutorial.)

Paper

Available on arXiv or in the ACM Digital Library.

Authors

Anna Frühstück, Ibraheem Alhashim, Peter Wonka
Contact: anna.fruehstueck (at) kaust.edu.sa

Citation

If you use this code for your research, please cite our paper:

@article{Fruehstueck2019TileGAN,
  title      = {{TileGAN}: Synthesis of Large-Scale Non-Homogeneous Textures},
  author     = {Fr\"{u}hst\"{u}ck, Anna and Alhashim, Ibraheem and Wonka, Peter},
  journal    = {ACM Transactions on Graphics (Proc. SIGGRAPH)},
  issue_date = {July 2019},
  volume     = {38},
  number     = {4},
  pages      = {58:1--58:11},
  year       = {2019}
}

Acknowledgements

Our project is based on ProGAN. We'd like to thank Tero Karras et al. for their great work and for making their code available.
