
lonePatient / Lookahead_pytorch

License: MIT
PyTorch implementation of the Lookahead Optimizer


Projects that are alternatives to or similar to Lookahead_pytorch

Bsconv
Reference implementation for Blueprint Separable Convolutions (CVPR 2020)
Stars: ✭ 84 (-39.13%)
Mutual labels:  cifar10
Adamw keras
AdamW optimizer for Keras
Stars: ✭ 106 (-23.19%)
Mutual labels:  optimizer
Keras Adabound
Keras implementation of AdaBound
Stars: ✭ 129 (-6.52%)
Mutual labels:  optimizer
Viz torch optim
Videos of deep learning optimizers moving on 3D problem-landscapes
Stars: ✭ 86 (-37.68%)
Mutual labels:  optimizer
Spectralnormalizationkeras
Spectral Normalization for Keras Dense and Convolution Layers
Stars: ✭ 100 (-27.54%)
Mutual labels:  cifar10
Petridishnn
Code for the neural architecture search methods contained in the paper Efficient Forward Neural Architecture Search
Stars: ✭ 112 (-18.84%)
Mutual labels:  cifar10
Label Embedding Network
Label Embedding Network
Stars: ✭ 69 (-50%)
Mutual labels:  cifar10
Generative adversarial networks 101
Keras implementations of Generative Adversarial Networks. GANs, DCGAN, CGAN, CCGAN, WGAN and LSGAN models with MNIST and CIFAR-10 datasets.
Stars: ✭ 138 (+0%)
Mutual labels:  cifar10
Pytorch shake shake
A PyTorch implementation of shake-shake
Stars: ✭ 101 (-26.81%)
Mutual labels:  cifar10
Pytorch Speech Commands
Speech commands recognition with PyTorch
Stars: ✭ 128 (-7.25%)
Mutual labels:  cifar10
Pytorch Classification
Classification with PyTorch.
Stars: ✭ 1,268 (+818.84%)
Mutual labels:  cifar10
Virtual Adversarial Training
Pytorch implementation of Virtual Adversarial Training
Stars: ✭ 94 (-31.88%)
Mutual labels:  cifar10
Adahessian
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
Stars: ✭ 114 (-17.39%)
Mutual labels:  optimizer
Tensorflow Cifar 10
Cifar-10 CNN implementation using TensorFlow library with 20% error.
Stars: ✭ 85 (-38.41%)
Mutual labels:  cifar10
Aognet
Code for CVPR 2019 paper: " Learning Deep Compositional Grammatical Architectures for Visual Recognition"
Stars: ✭ 132 (-4.35%)
Mutual labels:  cifar10
Cot
[ICLR'19] Complement Objective Training
Stars: ✭ 70 (-49.28%)
Mutual labels:  cifar10
Glsl Optimizer
GLSL optimizer based on Mesa's GLSL compiler. Used to be used in Unity for mobile shader optimization.
Stars: ✭ 1,506 (+991.3%)
Mutual labels:  optimizer
Image Optimize Command
Easily optimize images using WP CLI
Stars: ✭ 138 (+0%)
Mutual labels:  optimizer
Chainer Cifar10
Various CNN models for CIFAR10 with Chainer
Stars: ✭ 134 (-2.9%)
Mutual labels:  cifar10
Resnet On Cifar10
Reimplementation ResNet on cifar10 with caffe
Stars: ✭ 123 (-10.87%)
Mutual labels:  cifar10

Lookahead Pytorch

This repository contains a PyTorch implementation of the Lookahead Optimizer from the paper

Lookahead Optimizer: k steps forward, 1 step back

by Michael R. Zhang, James Lucas, Geoffrey Hinton and Jimmy Ba.
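The "k steps forward, 1 step back" rule from the paper can be sketched in plain Python (a minimal scalar illustration, not the repository's actual implementation): the fast weights take k steps with an inner optimizer, then the slow weights move toward them by a factor alpha and the fast weights are reset to the slow weights.

```python
# Minimal scalar sketch of the Lookahead update rule (illustrative only).
# The inner optimizer here is plain SGD; the paper uses any base optimizer.

def lookahead_sketch(w0, grad_fn, lr=0.1, k=5, alpha=0.5, outer_steps=3):
    slow = w0
    for _ in range(outer_steps):
        fast = slow
        for _ in range(k):                 # k steps forward with the inner optimizer
            fast -= lr * grad_fn(fast)
        slow += alpha * (fast - slow)      # 1 step back: interpolate toward fast weights
    return slow

# Minimizing f(w) = w^2 (gradient 2w) from w0 = 1.0 drives w toward 0.
w = lookahead_sketch(1.0, lambda w: 2.0 * w)
```

In the repository, this interpolation is applied per parameter tensor inside the Lookahead wrapper; the sketch above only shows the shape of the update.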

Dependencies

  • PyTorch
  • torchvision
  • matplotlib

Usage

The code in this repository implements training with both Lookahead and Adam, with examples on the CIFAR-10 dataset.

To use Lookahead, wrap a base optimizer as follows:

import torch.optim as optim
from optimizer import Lookahead

optimizer = optim.Adam(model.parameters(), lr=0.001)
optimizer = Lookahead(optimizer=optimizer, k=5, alpha=0.5)

We found that evaluation performance is typically better when using the slow weights. In PyTorch, this can be done in your evaluation loop like so:

if args.lookahead:
    optimizer._backup_and_load_cache()   # swap the slow weights in for evaluation
    val_loss = eval_func(model)
    optimizer._clear_and_load_backup()   # restore the fast weights for training
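The backup/restore pattern above can be illustrated with a small self-contained analog (the class and method names below are hypothetical, chosen to mirror the repository's _backup_and_load_cache / _clear_and_load_backup calls): fast weights are cached, slow weights are swapped in for evaluation, and the fast weights are restored afterwards.

```python
# Illustrative analog of evaluating with slow weights, then restoring fast weights.
# Names are hypothetical; weights are plain dicts rather than parameter tensors.

class SlowWeightSwapper:
    def __init__(self, fast, slow):
        self.fast = dict(fast)       # current (fast) training weights
        self.slow = dict(slow)       # cached slow weights
        self._backup = None

    def backup_and_load_cache(self):
        self._backup = dict(self.fast)   # save fast weights
        self.fast = dict(self.slow)      # evaluate with slow weights

    def clear_and_load_backup(self):
        self.fast = self._backup         # restore fast weights for training
        self._backup = None

s = SlowWeightSwapper(fast={"w": 1.0}, slow={"w": 0.5})
s.backup_and_load_cache()
eval_weights = dict(s.fast)    # slow weights are active during evaluation
s.clear_and_load_backup()
train_weights = dict(s.fast)   # fast weights are restored for training
```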

Example

To reproduce the results, we train ResNet-18 on the CIFAR-10 dataset:

# use adam
python run.py --optimizer=adam

# use lookahead 
python run.py --optimizer=lookahead

Results

Train loss of Adam and Lookahead with ResNet-18 on CIFAR-10.

Valid loss of Adam and Lookahead with ResNet-18 on CIFAR-10.

Valid accuracy of Adam and Lookahead with ResNet-18 on CIFAR-10.
