
fonfonx / WassersteinGAN.torch

Licence: other
Torch implementation of Wasserstein GAN https://arxiv.org/abs/1701.07875


Projects that are alternatives to, or similar to, WassersteinGAN.torch

image-background-remove-tool
✂️ Automated high-quality background removal framework for an image using neural networks. ✂️
Stars: ✭ 767 (+1497.92%)
Mutual labels:  torch
probabilistic nlg
Tensorflow Implementation of Stochastic Wasserstein Autoencoder for Probabilistic Sentence Generation (NAACL 2019).
Stars: ✭ 28 (-41.67%)
Mutual labels:  wasserstein
torch-dataframe
Utility class to manipulate dataset from CSV file
Stars: ✭ 67 (+39.58%)
Mutual labels:  torch
inpainting FRRN
Progressive Image Inpainting (Kolmogorov Team solution for Huawei Hackathon 2019 summer)
Stars: ✭ 30 (-37.5%)
Mutual labels:  torch
Captcha-Cracking
Crack number and Chinese captcha with both traditional and deep learning methods, based on Torch and python.
Stars: ✭ 35 (-27.08%)
Mutual labels:  torch
ALIGNet
code to train a neural network to align pairs of shapes without needing ground truth warps for supervision
Stars: ✭ 58 (+20.83%)
Mutual labels:  torch
deepgenres.torch
Predict the genre of a song using the Torch deep learning library
Stars: ✭ 18 (-62.5%)
Mutual labels:  torch
deep-learning-platforms
Deep-learning platforms, frameworks, and resources
Stars: ✭ 17 (-64.58%)
Mutual labels:  torch
S-WMD
Code for Supervised Word Mover's Distance (SWMD)
Stars: ✭ 90 (+87.5%)
Mutual labels:  wasserstein
sentence2vec
Deep sentence embedding using Sequence to Sequence learning
Stars: ✭ 23 (-52.08%)
Mutual labels:  torch
progressive-growing-of-gans.pytorch
Unofficial PyTorch implementation of "Progressive Growing of GANs for Improved Quality, Stability, and Variation".
Stars: ✭ 51 (+6.25%)
Mutual labels:  wasserstein-gan
progressive growing of GANs
Pure tensorflow implementation of progressive growing of GANs
Stars: ✭ 31 (-35.42%)
Mutual labels:  wasserstein-gan
vrn-torch-to-keras
Transfer pre-trained VRN model from torch to Keras/Tensorflow
Stars: ✭ 63 (+31.25%)
Mutual labels:  torch
eccv16 attr2img
Torch Implemention of ECCV'16 paper: Attribute2Image
Stars: ✭ 93 (+93.75%)
Mutual labels:  torch
gan-reverser
Reversing GAN image generation for similarity search and error/artifact fixing
Stars: ✭ 13 (-72.92%)
Mutual labels:  torch
yann
Yet Another Neural Network Library 🤔
Stars: ✭ 26 (-45.83%)
Mutual labels:  torch
hypnettorch
Package for working with hypernetworks in PyTorch.
Stars: ✭ 66 (+37.5%)
Mutual labels:  torch
torch-pitch-shift
Pitch-shift audio clips quickly with PyTorch (CUDA supported)! Additional utilities for searching efficient transformations are included.
Stars: ✭ 70 (+45.83%)
Mutual labels:  torch
Jetson-Nano-image
Jetson Nano image with deep learning frameworks
Stars: ✭ 46 (-4.17%)
Mutual labels:  torch
flambeau
Nim bindings to libtorch
Stars: ✭ 60 (+25%)
Mutual labels:  torch

Wasserstein GAN

This repository provides a Torch implementation of Wasserstein GAN as described by Arjovsky et al. in their paper Wasserstein GAN.

Prerequisites

  • Torch
  • cutorch, cunn, and cudnn to train the network on the GPU. Training on the CPU is supported but not recommended (it is very slow)

Please refer to the official Torch website to install Torch.

Usage

  1. Choose a dataset and create a folder with its name (e.g. mkdir celebA; cd celebA). Inside this folder, create another folder (images, for example) containing your images.
    Note: You can download the celebA dataset from the celebA web page. Extract the images and run
DATA_ROOT=celebA th data/crop_celebA.lua
  2. Train the Wasserstein GAN model
DATA_ROOT=<dataset_folder> name=<whatever_name_you_want> th main.lua

The networks are saved into the checkpoints/ directory with the name you gave.

  3. Generate images
net=<path_to_generator_network> name=<name_to_save_images> th generate.lua

Example:

net=checkpoints/generator.t7 name=myimages display=2929 th generate.lua

The generated images are saved in myimages.png.
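Putting the three steps above together, the full workflow can be sketched as a short shell script. Note this is only a sketch: the celebA_wgan name is arbitrary, and the generator checkpoint filename is an assumption, so check the checkpoints/ directory for the actual file Torch saved. By default the script prints the commands instead of executing them, so it can be inspected without a Torch install.

```shell
#!/bin/sh
# Sketch of the full workflow: dataset prep -> training -> generation.
# Commands are only printed by default (DRY_RUN=1); set DRY_RUN=0 to
# actually execute them (requires Torch and this repository).
DRY_RUN=${DRY_RUN:-1}
run() {
  if [ "$DRY_RUN" = "1" ]; then echo "$@"; else env "$@"; fi
}

# 1. Crop/align the raw celebA images
run DATA_ROOT=celebA th data/crop_celebA.lua
# 2. Train the WGAN; networks are saved under checkpoints/
run DATA_ROOT=celebA name=celebA_wgan th main.lua
# 3. Sample images from the saved generator (the .t7 filename below is an
#    assumption; use whatever generator file appears in checkpoints/)
run net=checkpoints/celebA_wgan_generator.t7 name=myimages th generate.lua
```

The `env "$@"` form works because each option is passed as a NAME=VALUE environment assignment before the `th` command, which is how the scripts read their parameters.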

Display images in a browser

If you want, install the display package (luarocks install display) and run

th -ldisplay.start <PORT_NUMBER> 0.0.0.0

to launch a server on the chosen port. You can access it in your browser at the URL http://localhost:<PORT_NUMBER>.

When training your network (or running completion), add the option display=<PORT_NUMBER> to the list of options.
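As a concrete sketch of the display workflow (port 8000 is an arbitrary choice, and the dataset/name values are illustrative), the commands below are echoed rather than executed so the snippet is self-contained:

```shell
#!/bin/sh
# Sketch of browser-based monitoring with the display package.
# Each command is printed, not run, so this works without Torch installed.
PORT=8000   # arbitrary example port

echo "luarocks install display"            # one-time install of the display rock
echo "th -ldisplay.start $PORT 0.0.0.0 &"  # start the display server in the background
echo "DATA_ROOT=celebA display=$PORT name=celebA_wgan th main.lua"  # training pushes previews
echo "Now open http://localhost:$PORT in a browser."
```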

Optional parameters

In your command-line instructions you can specify several parameters (for example, the display port number). Here are some of them:

  • noise, which can be either uniform or normal, indicates the prior distribution from which the samples are drawn
  • batchSize is the size of the batch used for training or the number of images to reconstruct
  • name is the name you want to use to save your networks or the generated images
  • gpu specifies whether the computations are done on the GPU. Set it to 0 to use the CPU (not recommended: too slow) or to n to use your nth GPU (the default is 1)
  • lr is the learning rate
  • loadSize is the size to which the images are rescaled; 0 means no rescaling
  • niter is the number of epochs for training
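For example, a training run overriding several of these options at once could look like the command sketched below. All values are illustrative, not recommendations (though lr=0.00005 matches the RMSProp learning rate used in the WGAN paper); the command is echoed rather than executed so the sketch needs no Torch install:

```shell
#!/bin/sh
# Sketch: composing a training command with several optional parameters set.
# Printed, not executed; remove the echo wrapper to actually run it.
CMD="DATA_ROOT=celebA name=wgan_normal noise=normal batchSize=64 lr=0.00005 loadSize=64 niter=25 gpu=1 display=2929 th main.lua"
echo "$CMD"
```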
