
CyberZHG / keras-lookahead

License: MIT
Lookahead mechanism for optimizers in Keras.

Programming Languages

python
shell

Projects that are alternatives of or similar to keras-lookahead

Pytorch Optimizer
torch-optimizer -- collection of optimizers for Pytorch
Stars: ✭ 2,237 (+4374%)
Mutual labels:  lookahead
TensorMONK
A collection of deep learning models (PyTorch implementation)
Stars: ✭ 21 (-58%)
Mutual labels:  lookahead
Nn
🧑‍🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Stars: ✭ 5,720 (+11340%)
Mutual labels:  optimizers
flaxOptimizers
A collection of optimizers, some arcane others well known, for Flax.
Stars: ✭ 21 (-58%)
Mutual labels:  optimizers
ML-Optimizers-JAX
Toy implementations of some popular ML optimizers using Python/JAX
Stars: ✭ 37 (-26%)
Mutual labels:  optimizers
transformer
Neutron: A pytorch based implementation of Transformer and its variants.
Stars: ✭ 60 (+20%)
Mutual labels:  optimizers
submodlib
Summarize Massive Datasets using Submodular Optimization
Stars: ✭ 36 (-28%)
Mutual labels:  optimizers
Hypergradient variants
Improved Hypergradient optimizers, providing better generalization and faster convergence.
Stars: ✭ 15 (-70%)
Mutual labels:  optimizers
axon
Nx-powered Neural Networks
Stars: ✭ 1,170 (+2240%)
Mutual labels:  optimizers

Keras Lookahead


This repo is outdated and will no longer be maintained.

Unofficial implementation of the lookahead mechanism for optimizers.

Install

pip install git+https://github.com/cyberzhg/keras-lookahead.git

External Links

  • Lookahead Optimizer: k steps forward, 1 step back (https://arxiv.org/abs/1907.08610)

Usage

from keras_lookahead import Lookahead

optimizer = Lookahead('adam', sync_period=5, slow_step=0.5)

Arguments:

  • optimizer: Original optimizer.
  • sync_period: the k in the paper. The synchronization period.
  • slow_step: the α in the paper. The step size of the slow weights.
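For intuition, sync_period and slow_step control the rule from the paper: every k inner-optimizer steps, the slow weights move toward the fast weights by a factor of α, and the fast weights are then reset to the slow weights. A minimal NumPy sketch of that rule (illustrative only; the function and variable names below are not part of this package):

import numpy as np

def lookahead_update(fast_weights, slow_weights, step, sync_period=5, slow_step=0.5):
    # Called after each inner-optimizer update; `step` counts from 1.
    if step % sync_period == 0:
        # Slow weights interpolate toward the fast weights: slow += α * (fast - slow).
        slow_weights = slow_weights + slow_step * (fast_weights - slow_weights)
        # Fast weights restart from the updated slow weights.
        fast_weights = slow_weights.copy()
    return fast_weights, slow_weights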

Custom optimizers can also be used:

from keras_radam import RAdam
from keras_lookahead import Lookahead

optimizer = Lookahead(RAdam())
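The wrapped optimizer is passed to compile like any other Keras optimizer. A minimal end-to-end sketch, assuming the standalone keras package is installed; the toy model and random data below are illustrative assumptions, not part of this repo:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense
from keras_lookahead import Lookahead

# Toy data and model, only to show where the wrapped optimizer goes.
x = np.random.random((32, 10))
y = np.random.randint(2, size=(32, 1))

model = Sequential([
    Dense(16, activation='relu', input_shape=(10,)),
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer=Lookahead('adam', sync_period=5, slow_step=0.5),
              loss='binary_crossentropy')
model.fit(x, y, epochs=2, verbose=0)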