
Zardinality / Wgan Tensorflow

A TensorFlow implementation of WGAN

Projects that are alternatives to or similar to Wgan Tensorflow

Zhihu
This repo contains the source code for my personal column (https://zhuanlan.zhihu.com/zhaoyeyu), implemented in Python 3.6. It includes natural language processing and computer vision projects, such as text generation, machine translation, deep convolutional GANs, and other hands-on code.
Stars: ✭ 3,307 (+478.15%)
Mutual labels:  jupyter-notebook, gan
Simgan Captcha
Solve captcha without manually labeling a training set
Stars: ✭ 405 (-29.2%)
Mutual labels:  jupyter-notebook, gan
T81 558 deep learning
Washington University (in St. Louis) Course T81-558: Applications of Deep Neural Networks
Stars: ✭ 4,152 (+625.87%)
Mutual labels:  jupyter-notebook, gan
Nn
🧑‍🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Stars: ✭ 5,720 (+900%)
Mutual labels:  jupyter-notebook, gan
Generative Adversarial Networks
Introduction to generative adversarial networks, with code to accompany the O'Reilly tutorial on GANs
Stars: ✭ 505 (-11.71%)
Mutual labels:  jupyter-notebook, gan
Pytorch Lesson Zh
A PyTorch tutorial series (teaching guaranteed, mastery not)
Stars: ✭ 279 (-51.22%)
Mutual labels:  jupyter-notebook, gan
Sdv
Synthetic Data Generation for tabular, relational and time series data.
Stars: ✭ 360 (-37.06%)
Mutual labels:  jupyter-notebook, gan
Dragan
A stable algorithm for GAN training
Stars: ✭ 189 (-66.96%)
Mutual labels:  jupyter-notebook, gan
Gantts
PyTorch implementation of GAN-based text-to-speech synthesis and voice conversion (VC)
Stars: ✭ 460 (-19.58%)
Mutual labels:  jupyter-notebook, gan
Generative Models
Annotated, understandable, and visually interpretable PyTorch implementations of: VAE, BIRVAE, NSGAN, MMGAN, WGAN, WGANGP, LSGAN, DRAGAN, BEGAN, RaGAN, InfoGAN, fGAN, FisherGAN
Stars: ✭ 438 (-23.43%)
Mutual labels:  jupyter-notebook, gan
Gan Tutorial
Simple Implementation of many GAN models with PyTorch.
Stars: ✭ 227 (-60.31%)
Mutual labels:  jupyter-notebook, gan
Tf Tutorials
A collection of deep learning tutorials using Tensorflow and Python
Stars: ✭ 524 (-8.39%)
Mutual labels:  jupyter-notebook, gan
Gan steerability
On the "steerability" of generative adversarial networks
Stars: ✭ 225 (-60.66%)
Mutual labels:  jupyter-notebook, gan
Faceswap Gan
A denoising autoencoder + adversarial losses and attention mechanisms for face swapping.
Stars: ✭ 3,099 (+441.78%)
Mutual labels:  jupyter-notebook, gan
Swapnet
Virtual Clothing Try-on with Deep Learning. PyTorch reproduction of SwapNet by Raj et al. 2018. Now with Docker support!
Stars: ✭ 202 (-64.69%)
Mutual labels:  jupyter-notebook, gan
Advanced Tensorflow
Little More Advanced TensorFlow Implementations
Stars: ✭ 364 (-36.36%)
Mutual labels:  jupyter-notebook, gan
Keraspp
Coding Chef's 3-Minute Deep Learning, Keras Flavor
Stars: ✭ 178 (-68.88%)
Mutual labels:  jupyter-notebook, gan
Gans From Theory To Production
Material for the tutorial: "Deep Diving into GANs: from theory to production"
Stars: ✭ 182 (-68.18%)
Mutual labels:  jupyter-notebook, gan
Deep Learning Resources
Deep learning resources from beginner to advanced: a collection of deep learning materials for everyone
Stars: ✭ 422 (-26.22%)
Mutual labels:  jupyter-notebook, gan
Hidt
Official repository for the paper "High-Resolution Daytime Translation Without Domain Labels" (CVPR2020, Oral)
Stars: ✭ 513 (-10.31%)
Mutual labels:  jupyter-notebook, gan

Wasserstein GAN

This is a TensorFlow implementation of WGAN on MNIST and SVHN.

Requirement

tensorflow >= 1.0.0

numpy

matplotlib

cv2 (OpenCV)

Usage

Train: use WGAN.ipynb. Set the parameters in the second cell and choose the dataset you want to train on. You can use TensorBoard to visualize training.

Generation: use generate_from_ckpt.ipynb and set ckpt_dir in the second cell. Don't forget to change the dataset type accordingly.

Note

  1. All data will be downloaded automatically; the SVHN script is modified from this.

  2. All parameters default to the values recommended in the original paper. Diters is the number of critic updates per generator step; in the original PyTorch version it was set to 5 unless iterstep < 25 or iterstep % 500 == 0. I guess that, since the critic is free to be optimized fully, it is reasonable to give it more updates at the beginning and on every 500th step, so I borrowed this schedule without tuning. The learning rates for the generator and critic are both set to 5e-5. Since the gradient norms stay relatively high during training (around 1e3), I suggest no drastic changes to the learning rates.
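The critic-update schedule described above can be sketched in plain Python. The helper name is illustrative, not from this repo; the boosted value of 100 is the one used in the original PyTorch WGAN code:

```python
def critic_iters(gen_iter, base_diters=5, warmup=25, boost_every=500, boost_iters=100):
    """Number of critic updates for a given generator iteration.

    Mirrors the schedule borrowed from the original PyTorch code: the
    critic gets many more updates during the first `warmup` generator
    steps and on every `boost_every`-th step; otherwise `base_diters`.
    """
    if gen_iter < warmup or gen_iter % boost_every == 0:
        return boost_iters
    return base_diters

print(critic_iters(3))    # warmup step: critic trained hard
print(critic_iters(42))   # normal step: 5 critic updates
print(critic_iters(500))  # periodic boost step
```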

  3. The MLP version can take longer to generate sharp images.

  4. In this implementation, the critic loss is tf.reduce_mean(fake_logit - true_logit) and the generator loss is tf.reduce_mean(-fake_logit). The whole system still works if you add a - in front of both of them; it doesn't matter. Recall that the critic loss in its dual form is $\sup_{\|f\|_L \le 1} \mathbb{E}_{x \sim \mathbb{P}_r}[f(x)] - \mathbb{E}_{x \sim \mathbb{P}_g}[f(x)]$, and the constraint set $\{f : \|f\|_L \le 1\}$ is symmetric under a sign flip. Substituting $f$ with $-f$ gives $\mathbb{E}_{x \sim \mathbb{P}_g}[f(x)] - \mathbb{E}_{x \sim \mathbb{P}_r}[f(x)]$, the opposite of the original form. The original PyTorch implementation takes the second form and this implementation takes the first; both work equally well. You might want to add the - and try it out.
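The sign symmetry in point 4 is easy to check numerically. This is a NumPy sketch with made-up logits, not the notebook's code:

```python
import numpy as np

rng = np.random.default_rng(0)
true_logit = rng.normal(size=64)   # critic outputs on real samples
fake_logit = rng.normal(size=64)   # critic outputs on generated samples

# Form used in this implementation:
critic_loss = np.mean(fake_logit - true_logit)
gen_loss = np.mean(-fake_logit)

# Negated form (the original PyTorch convention):
critic_loss_neg = np.mean(true_logit - fake_logit)
gen_loss_neg = np.mean(fake_logit)

# The two conventions are exact negatives of each other, which is why
# training works under either one: the critic simply learns f or -f.
assert np.isclose(critic_loss, -critic_loss_neg)
assert np.isclose(gen_loss, -gen_loss_neg)
```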

  5. Set the device you want to run on in the code: search for tf.device and change it accordingly. It runs on gpu:0 by default.

  6. Improved WGAN (WGAN-GP) is added, but somehow the gradient norm is already close to 1, so the squared-gradient penalty has almost no effect. I couldn't figure out why.
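For reference, the WGAN-GP penalty of Gulrajani et al. penalizes the critic's gradient norm at random interpolates between real and fake samples. Below is a NumPy sketch of the penalty term for a toy linear critic, whose input gradient is constant; all names are illustrative, not from this repo:

```python
import numpy as np

rng = np.random.default_rng(0)
w = rng.normal(size=8)                 # toy linear critic f(x) = w @ x
real = rng.normal(size=(32, 8))
fake = rng.normal(size=(32, 8))

# Interpolate between real and fake samples (one epsilon per sample).
eps = rng.uniform(size=(32, 1))
interp = eps * real + (1 - eps) * fake

# For a linear critic, the gradient w.r.t. the input is just w, so the
# gradient norm is identical at every interpolate.
grad_norms = np.full(len(interp), np.linalg.norm(w))

# WGAN-GP adds lambda * E[(||grad f(x_hat)|| - 1)^2] to the critic loss;
# if the norms already sit near 1 (as observed in point 6), this term
# contributes almost nothing.
lam = 10.0
gradient_penalty = lam * np.mean((grad_norms - 1.0) ** 2)
```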
