
sanghoon / prediction_gan

Licence: other
PyTorch Impl. of Prediction Optimizer (to stabilize GAN training)

Programming Languages

python
139335 projects - #7 most used programming language
shell
77523 projects

Projects that are alternatives to or similar to prediction_gan

Viz torch optim
Videos of deep learning optimizers moving on 3D problem-landscapes
Stars: ✭ 86 (+177.42%)
Mutual labels:  optimizer
React Lite
An implementation of React v15.x that optimizes for small script size
Stars: ✭ 1,734 (+5493.55%)
Mutual labels:  optimizer
AshBF
Over-engineered Brainfuck optimizing compiler and interpreter
Stars: ✭ 14 (-54.84%)
Mutual labels:  optimizer
Glsl Optimizer
GLSL optimizer based on Mesa's GLSL compiler. Used to be used in Unity for mobile shader optimization.
Stars: ✭ 1,506 (+4758.06%)
Mutual labels:  optimizer
Lookahead pytorch
pytorch implement of Lookahead Optimizer
Stars: ✭ 138 (+345.16%)
Mutual labels:  optimizer
Image Optimizer
Easily optimize images using PHP
Stars: ✭ 2,127 (+6761.29%)
Mutual labels:  optimizer
Jhc Components
JHC Haskell compiler split into reusable components
Stars: ✭ 55 (+77.42%)
Mutual labels:  optimizer
Pytorch-Basic-GANs
Simple Pytorch implementations of most used Generative Adversarial Network (GAN) varieties.
Stars: ✭ 101 (+225.81%)
Mutual labels:  dcgan
Nn dataflow
Explore the energy-efficient dataflow scheduling for neural networks.
Stars: ✭ 141 (+354.84%)
Mutual labels:  optimizer
neth-proxy
Stratum <-> Stratum Proxy and optimizer for ethminer
Stars: ✭ 35 (+12.9%)
Mutual labels:  optimizer
Adahessian
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
Stars: ✭ 114 (+267.74%)
Mutual labels:  optimizer
Image Optimize Command
Easily optimize images using WP CLI
Stars: ✭ 138 (+345.16%)
Mutual labels:  optimizer
Draftfast
A tool to automate and optimize DraftKings and FanDuel lineup construction.
Stars: ✭ 192 (+519.35%)
Mutual labels:  optimizer
Adamw keras
AdamW optimizer for Keras
Stars: ✭ 106 (+241.94%)
Mutual labels:  optimizer
artificial-neural-variability-for-deep-learning
The PyTorch Implementation of Variable Optimizers/ Neural Variable Risk Minimization proposed in our Neural Computation paper: Artificial Neural Variability for Deep Learning: On overfitting, Noise Memorization, and Catastrophic Forgetting.
Stars: ✭ 34 (+9.68%)
Mutual labels:  optimizer
Gpx Simplify Optimizer
Free Tracks Optimizer Online Service
Stars: ✭ 61 (+96.77%)
Mutual labels:  optimizer
Pytorch Optimizer
torch-optimizer -- collection of optimizers for Pytorch
Stars: ✭ 2,237 (+7116.13%)
Mutual labels:  optimizer
XTR-Toolbox
🛠 Versatile tool to optimize Windows
Stars: ✭ 138 (+345.16%)
Mutual labels:  optimizer
horoscope
horoscope is an optimizer inspector for DBMS.
Stars: ✭ 34 (+9.68%)
Mutual labels:  optimizer
Radam
On the Variance of the Adaptive Learning Rate and Beyond
Stars: ✭ 2,442 (+7777.42%)
Mutual labels:  optimizer

Prediction Optimizer (to stabilize GAN training)

Introduction

This is a PyTorch implementation of the 'prediction method' introduced in the following paper:

  • Abhay Yadav et al., Stabilizing Adversarial Nets with Prediction Methods, ICLR 2018, Link
  • (Just for clarification, I'm not an author of the paper.)

The authors proposed a simple but effective method to stabilize GAN training. With this Prediction Optimizer, you can easily apply the method to your existing GAN code. The implementation is compatible with most PyTorch optimizers and network structures. (Please let me know if you run into any issues using it.)
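
For intuition, the prediction rule from the paper extrapolates along the most recent parameter update: if the parameters moved from theta_prev to theta in the last update, the predicted copy is roughly theta + step * (theta - theta_prev). The snippet below is only an illustrative sketch of such a helper, not the prediction.py shipped in this repository; the lookahead()/step() method names simply mirror the API described in the next section, and the class name LookaheadSketch is made up for this example.

    import contextlib
    import torch


    class LookaheadSketch:
        """Illustrative sketch of a prediction helper (not this repo's prediction.py).

        step() records the most recent parameter update; lookahead() temporarily
        replaces each parameter p with p + step * (p - p_previous) and restores
        the original values when the 'with' block exits.
        """

        def __init__(self, params):
            self._params = list(params)
            self._prev = [p.detach().clone() for p in self._params]
            self._diff = [torch.zeros_like(p) for p in self._params]

        @torch.no_grad()
        def step(self):
            # Call right after the optimizer has updated the parameters.
            for p, prev, diff in zip(self._params, self._prev, self._diff):
                diff.copy_(p - prev)   # latest update direction
                prev.copy_(p)

        @contextlib.contextmanager
        def lookahead(self, step=1.0):
            backup = [p.detach().clone() for p in self._params]
            with torch.no_grad():
                for p, diff in zip(self._params, self._diff):
                    p.add_(diff, alpha=step)       # extrapolate one (scaled) step ahead
            try:
                yield
            finally:
                with torch.no_grad():
                    for p, saved in zip(self._params, backup):
                        p.copy_(saved)             # restore the real parameters

With step=1.0 this corresponds to the one-step-ahead prediction described in the paper; other values simply scale how far the parameters are extrapolated.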

How-to-use

Instructions

  • Import prediction.py
    • from prediction import PredOpt
  • Initialize just like an optimizer
    • pred = PredOpt(net.parameters())
  • Run the model in a 'with' block to get results from a model with predicted params.
    • With the 'step' argument, you can control the lookahead step size (1.0 by default; a short sketch follows this list)
    • with pred.lookahead(step=1.0):
          output = net(input)
  • Call step() after an update of the network parameters
    • optim_net.step()
      pred.step()
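
For instance, the effect of the step argument can be illustrated with a short sketch; net, input, and pred are assumed to be set up as in the steps above. Under the extrapolation rule sketched earlier, step=0.0 leaves the parameters unchanged, while larger values extrapolate further along the most recent update:

    # Hypothetical objects: 'net' and 'input' as in the instructions above,
    # 'pred' created via PredOpt(net.parameters()).
    for s in (0.0, 0.5, 1.0, 2.0):
        with pred.lookahead(step=s):
            output = net(input)    # forward pass with parameters extrapolated by a factor of s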

Samples

  • You can find a sample code in this repository (example_gan.py)
  • A sample snippet
  • import torch.optim as optim
    from prediction import PredOpt
    
    
    # ...
    
    optim_G = optim.Adam(netG.parameters(), lr=0.01)
    optim_D = optim.Adam(netD.parameters(), lr=0.01)
    
    pred_G = PredOpt(netG.parameters())             # Create a prediction optimizer with the target parameters
    pred_D = PredOpt(netD.parameters())
    
    
    for i, data in enumerate(dataloader, 0):
        # (1) Training D with samples from predicted generator
        with pred_G.lookahead(step=1.0):            # in the 'with' block, the model works as a 'predicted' model
            fake_predicted = netG(Z)                           
        
            # Compute gradients and loss 
        
            optim_D.step()
            pred_D.step()
        
        # (2) Training G
        with pred_D.lookahead(step=1.0):            # 'Predicted D'
            fake = netG(Z)                          # Draw samples from the real model (not the predicted one)
            D_outs = netD(fake)
    
            # Compute gradients and loss
    
            optim_G.step()
            pred_G.step()                           # You should call PredOpt.step() after each update

Output samples

You can find more images in the repository's issues.

Training w/ large learning rate (0.01)

(Images: CIFAR and CelebA samples at epoch 25, vanilla DCGAN vs. DCGAN w/ prediction (step=1.0), lr=0.01)

Training w/ medium learning rate (1e-4)

(Images: CIFAR and CelebA samples at epoch 25, vanilla DCGAN vs. DCGAN w/ prediction (step=1.0), lr=1e-4)

Training w/ small learning rate (1e-5)

(Images: CIFAR and CelebA samples at epoch 25, vanilla DCGAN vs. DCGAN w/ prediction (step=1.0), lr=1e-5)

External links

TODOs

  • Impl. as an optimizer
  • Support pip install
  • Add some experimental results