
ANIME305 / Anime Gan Tensorflow

A BigGAN-based anime face generator implemented with TensorFlow. All training data has been open-sourced.

Projects that are alternatives to or similar to Anime Gan Tensorflow

Chess Surprise Analysis
Find surprising moves in chess games
Stars: ✭ 178 (-1.11%)
Mutual labels:  jupyter-notebook
Practical Torchtext
A set of tutorials for torchtext
Stars: ✭ 179 (-0.56%)
Mutual labels:  jupyter-notebook
Oct2py
Run M Files from Python - GNU Octave to Python bridge
Stars: ✭ 179 (-0.56%)
Mutual labels:  jupyter-notebook
Code Of Learn Deep Learning With Pytorch
This is code of book "Learn Deep Learning with PyTorch"
Stars: ✭ 2,262 (+1156.67%)
Mutual labels:  jupyter-notebook
Feature Selection For Machine Learning
Methods with examples for Feature Selection during Pre-processing in Machine Learning.
Stars: ✭ 178 (-1.11%)
Mutual labels:  jupyter-notebook
Ageron handson Ml
Hands-On Machine Learning with Scikit-Learn & TensorFlow (O'Reilly)
Stars: ✭ 179 (-0.56%)
Mutual labels:  jupyter-notebook
Lstm anomaly thesis
Anomaly detection for temporal data using LSTMs
Stars: ✭ 178 (-1.11%)
Mutual labels:  jupyter-notebook
Andrew Ng Notes
This is Andrew NG Coursera Handwritten Notes.
Stars: ✭ 180 (+0%)
Mutual labels:  jupyter-notebook
Machine Learning For Finance
Machine Learning for Finance, published by Packt
Stars: ✭ 179 (-0.56%)
Mutual labels:  jupyter-notebook
Introduction To Data Science In Python
This repository contains Ipython notebooks of assignments and tutorials used in the course introduction to data science in python, part of Applied Data Science using Python Specialization from University of Michigan offered by Coursera
Stars: ✭ 179 (-0.56%)
Mutual labels:  jupyter-notebook
Papergraph
AI/ML citation graph with postgres + graphql
Stars: ✭ 178 (-1.11%)
Mutual labels:  jupyter-notebook
Rnn Active User Forecast
1st place solution for the Kuaishou Active-user Forecast competition
Stars: ✭ 179 (-0.56%)
Mutual labels:  jupyter-notebook
Infiniteboost
InfiniteBoost: building infinite ensembles with gradient descent
Stars: ✭ 180 (+0%)
Mutual labels:  jupyter-notebook
Supercell
supercell
Stars: ✭ 178 (-1.11%)
Mutual labels:  jupyter-notebook
Deeplearning.ai
This repository contains personal notes and implementation code for the courses offered by deeplearning.ai.
Stars: ✭ 181 (+0.56%)
Mutual labels:  jupyter-notebook
Python data science and machine learning bootcamp
Jupyter notebook for Udemy course: Python data science and machine learning bootcamp
Stars: ✭ 178 (-1.11%)
Mutual labels:  jupyter-notebook
Keraspp
Coding Chef's 3-Minute Deep Learning, Keras Flavor
Stars: ✭ 178 (-1.11%)
Mutual labels:  jupyter-notebook
Yolov3 Tf2
YoloV3 Implemented in Tensorflow 2.0
Stars: ✭ 2,327 (+1192.78%)
Mutual labels:  jupyter-notebook
Amass
Data preparation and loader for AMASS
Stars: ✭ 180 (+0%)
Mutual labels:  jupyter-notebook
Coco Analyze
A wrapper of the COCOeval class for extended keypoint error estimation analysis.
Stars: ✭ 179 (-0.56%)
Mutual labels:  jupyter-notebook

ANIME-FACE-GAN

Implementations

1. SA-GAN

https://arxiv.org/pdf/1805.08318.pdf

Implemented with self-attention, Conv2DTranspose, hinge loss, and spectral normalization.

The SAGAN was trained with batch size 64 and used only 3 GB of GPU memory. It needs about 50,000 steps of training.
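
The hinge loss mentioned above has a compact form; here is a minimal numpy sketch (the function names are illustrative, not taken from this repo):

```python
import numpy as np

def d_hinge_loss(d_real, d_fake):
    # Discriminator hinge loss: push D(real) scores above +1
    # and D(fake) scores below -1; zero loss once both margins hold.
    return (np.mean(np.maximum(0.0, 1.0 - d_real))
            + np.mean(np.maximum(0.0, 1.0 + d_fake)))

def g_hinge_loss(d_fake):
    # Generator hinge loss: raise the discriminator's score on fakes.
    return -np.mean(d_fake)
```

In a real training loop `d_real` and `d_fake` would be the discriminator's raw (un-sigmoided) logits on real and generated batches.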

2. BIG-GAN

https://arxiv.org/abs/1809.11096?context=cs.LG

The BIGGAN was trained with batch size 64 and used 16 GB of GPU memory (batch size 32 uses 10 GB on a 1080 Ti). It needs only 10,000 steps of training.
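
Spectral normalization, used by both models above, rescales each weight matrix so its largest singular value is 1. A minimal power-iteration sketch in numpy (a standalone illustration, not this repo's implementation):

```python
import numpy as np

def spectral_norm(w, n_iter=5, rng=None):
    # Estimate the largest singular value of a 2-D weight matrix
    # by power iteration, then divide the weights by it.
    rng = np.random.default_rng(0) if rng is None else rng
    u = rng.normal(size=w.shape[0])
    for _ in range(n_iter):
        v = w.T @ u
        v /= np.linalg.norm(v) + 1e-12
        u = w @ v
        u /= np.linalg.norm(u) + 1e-12
    sigma = u @ w @ v  # approximate top singular value
    return w / sigma
```

In practice the vector `u` is cached between training steps so one iteration per step suffices.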

Results

State-of-the-art BIGGAN, 12,600 steps

12600 steps

GIF

gif

SAGAN 61600 steps (without the residual structure)

61600 steps

Open sourced dataset

We decided to open-source the datasets we used.

The datasets were cleaned and labeled by hand; enjoy your own GAN playground! LOL

URL: https://pan.baidu.com/s/1xXPeqr6SDnQkaNZcVHCZ7Q

Extraction code: u3bi

Loss

Generator loss (g_loss) and discriminator loss (d_loss) curves.

Why does the generator loss collapse around 15k steps?

Model Records

SAGAN_V2: SAGAN + deconv

SAGAN_V3: SAGAN + deconv + bs=64 + truncated_normal

SAGAN_V4: SAGAN + upsample + bs=128 + truncated_normal

SAGAN_V5: SAGAN + deconv + bs=64 + lr decay after 50k steps + EMA for generator

SAGAN_V6: SAGAN + deconv + bs=64 + EMA for generator

SResNetGAN_V0: SResNetGAN + pixelshuffler (failed)

SResNetGAN_V1: SResNetGAN + deconv (failed)

BIGGAN_V0: BIGGAN + generator 512

BIGGAN_V1: BIGGAN + generator 1024 (best now!)
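
Several of the runs above use truncated-normal input noise. A minimal rejection-sampling sketch in numpy (the std=0.5 and [-1, 1] bounds follow the Experience notes below; the function name is illustrative):

```python
import numpy as np

def truncated_normal(shape, std=0.5, low=-1.0, high=1.0, rng=None):
    # Draw normal(0, std) noise and redraw any values that fall
    # outside [low, high] until all samples are inside the bounds.
    rng = np.random.default_rng() if rng is None else rng
    z = rng.normal(0.0, std, size=shape)
    while True:
        bad = (z < low) | (z > high)
        if not bad.any():
            return z
        z[bad] = rng.normal(0.0, std, size=int(bad.sum()))
```

The same effect is available in TensorFlow via its built-in truncated-normal initializer, which cuts at two standard deviations.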

Experience

  • Using a truncated normal (std=0.5, truncated to [-1, 1]) for the input noise instead of a uniform or plain Gaussian distribution helps convergence.
  • The binomial distribution works badly.
  • Using Conv2DTranspose instead of Upsampling improves image quality; Upsampling also loses some diversity.
  • A bigger batch size (128, 256, ...) doesn't achieve better performance in this project (not sure).
  • Ensure enough training steps (at least 50k for SAGAN).
  • Applying an ExponentialMovingAverage to the generator's weights improves the stability of the generated images.
  • It is important to keep the parameter counts of the discriminator and generator close.
  • The residual structure and increasing the parameter scale of both discriminator and generator improve the detail of the generated images.
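
The ExponentialMovingAverage trick from the notes above amounts to a simple shadow-weight update; a pure-Python sketch (the helper name and decay value are assumptions, not this repo's code):

```python
def ema_update(shadow, weights, decay=0.999):
    # For each parameter k: shadow[k] <- decay * shadow[k] + (1 - decay) * weights[k].
    # `shadow` holds the smoothed copy used for generating samples;
    # `weights` are the live generator parameters after an optimizer step.
    return {k: decay * shadow[k] + (1.0 - decay) * w for k, w in weights.items()}
```

Sampling from the shadow weights rather than the live ones is what smooths out step-to-step jitter in the generated images.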

Questions

  • PixelShuffle works badly (poor diversity).
  • The discriminator's hinge loss usually equals 0 during the second half of training.
  • The quality of the generated images drops rapidly after a certain number of steps (70k in SAGAN, 14k in BIGGAN).

TODO

  • [x] Add ExponentialMovingAverage to the generator.
  • [x] Decay the learning rate exponentially after 50,000 training iterations (failed, not sure).
  • [x] Add labels from illustration2vec.
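
The learning-rate decay item above could be expressed as a step-dependent schedule; a sketch in pure Python (base_lr, decay_rate, and decay_every are assumed values, not taken from this repo):

```python
def lr_schedule(step, base_lr=2e-4, decay_start=50_000,
                decay_rate=0.99, decay_every=1_000):
    # Keep the learning rate constant until decay_start, then
    # multiply by decay_rate once per decay_every further steps.
    if step < decay_start:
        return base_lr
    return base_lr * decay_rate ** ((step - decay_start) / decay_every)
```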