
AdaBound in Tensorflow

An optimizer that trains as fast as Adam and as good as SGD in Tensorflow

This repo is based on the original PyTorch implementation.


Explanation

An optimizer that trains as fast as Adam and as good as SGD, for developing state-of-the-art deep learning models on a wide variety of popular tasks in fields such as CV and NLP.

Based on Luo et al. (2019). Adaptive Gradient Methods with Dynamic Bound of Learning Rate. In Proc. of ICLR 2019.
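To make the bounding mechanism concrete, below is a minimal NumPy sketch of a single AdaBound update on one parameter, following the dynamic-bound schedule from the paper and the reference PyTorch implementation. The function name and stand-alone structure are illustrative only, and the reference implementation's rescaling of final_lr by the current/base learning-rate ratio is omitted for brevity.

import numpy as np

def adabound_step(param, grad, m, v, t,
                  lr=1e-3, final_lr=1e-1, beta_1=0.9, beta_2=0.999,
                  gamma=1e-3, epsilon=1e-6):
    """One AdaBound update (sketch). t is the 1-based step count."""
    # Adam-style exponential moving averages of the gradient and its square
    m = beta_1 * m + (1.0 - beta_1) * grad
    v = beta_2 * v + (1.0 - beta_2) * grad ** 2
    # Bias-corrected base step size, exactly as in Adam
    step_size = lr * np.sqrt(1.0 - beta_2 ** t) / (1.0 - beta_1 ** t)
    # Dynamic bounds: both converge to final_lr as t grows, so the update
    # smoothly transitions from Adam-like to SGD-like behavior
    lower_bound = final_lr * (1.0 - 1.0 / (gamma * t + 1.0))
    upper_bound = final_lr * (1.0 + 1.0 / (gamma * t))
    # Clip the per-coordinate learning rate into [lower_bound, upper_bound]
    clipped_lr = np.clip(step_size / (np.sqrt(v) + epsilon),
                         lower_bound, upper_bound)
    return param - clipped_lr * m, m, v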

Requirements

  • Python 3.x
  • TensorFlow 1.x (may also work with 2.x)

Usage

# learning_rate can be either a scalar or a tensor

# use the exclude_from_weight_decay option if you want to
# selectively skip weight decay for certain variables

from adabound import AdaBoundOptimizer  # assuming adabound.py from this repo is on the path

optimizer = AdaBoundOptimizer(
    learning_rate=1e-3,
    final_lr=1e-1,
    beta_1=0.9,
    beta_2=0.999,
    gamma=1e-3,
    epsilon=1e-6,
    amsbound=False,
    decay=0.,
    weight_decay=0.,
    exclude_from_weight_decay=["..."]
)
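
Once constructed, the optimizer can be used like any other TensorFlow 1.x optimizer, e.g. via minimize(). A minimal sketch, assuming AdaBoundOptimizer is importable from adabound.py in this repo and follows the standard tf.train.Optimizer interface (the one-layer model is illustrative):

import tensorflow as tf
from adabound import AdaBoundOptimizer  # assumed module name from this repo

x = tf.placeholder(tf.float32, [None, 784])  # flattened MNIST images
y = tf.placeholder(tf.float32, [None, 10])   # one-hot labels

logits = tf.layers.dense(x, 10)
loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=logits))

optimizer = AdaBoundOptimizer(learning_rate=1e-3, final_lr=1e-1)
train_op = optimizer.minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # sess.run(train_op, feed_dict={x: batch_images, y: batch_labels})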

You can simply test the optimizers on the MNIST dataset with the model below.

For the AdaBound optimizer:

python3 mnist_test.py --optimizer "adabound"

For the AMSBound optimizer:

python3 mnist_test.py --optimizer "amsbound"

Results

Test accuracy and loss for each optimizer on several datasets under the same conditions.

MNIST Dataset

(test-accuracy plot)

Optimizer   Test Acc   Time     Etc
AdaBound    97.77%     5m 45s
AMSBound    97.72%     5m 52s
Adam        97.62%     4m 18s
AdaGrad     90.15%     4m 07s
SGD         87.88%     5m 26s
Momentum    87.88%     4m 26s   w/ Nesterov

Citation

@inproceedings{Luo2019AdaBound,
  author = {Luo, Liangchen and Xiong, Yuanhao and Liu, Yan and Sun, Xu},
  title = {Adaptive Gradient Methods with Dynamic Bound of Learning Rate},
  booktitle = {Proceedings of the 7th International Conference on Learning Representations},
  month = {May},
  year = {2019},
  address = {New Orleans, Louisiana}
}

Author

Hyeongchan Kim / kozistr
