
angetato / Optimizers-for-Tensorflow

Licence: other
Adam, NAdam and AAdam optimizers

Programming Languages

python

Projects that are alternatives of or similar to Optimizers-for-Tensorflow

ada-hessian
Easy-to-use AdaHessian optimizer (PyTorch)
Stars: ✭ 59 (+195%)
Mutual labels:  optimizer, adam
Radam
On the Variance of the Adaptive Learning Rate and Beyond
Stars: ✭ 2,442 (+12110%)
Mutual labels:  optimizer, adam
Glsl Optimizer
GLSL optimizer based on Mesa's GLSL compiler. Used to be used in Unity for mobile shader optimization.
Stars: ✭ 1,506 (+7430%)
Mutual labels:  optimizer
horoscope
horoscope is an optimizer inspector for DBMS.
Stars: ✭ 34 (+70%)
Mutual labels:  optimizer
Image Optimizer
Easily optimize images using PHP
Stars: ✭ 2,127 (+10535%)
Mutual labels:  optimizer
Keras Adabound
Keras implementation of AdaBound
Stars: ✭ 129 (+545%)
Mutual labels:  optimizer
Adahessian
ADAHESSIAN: An Adaptive Second Order Optimizer for Machine Learning
Stars: ✭ 114 (+470%)
Mutual labels:  optimizer
Viz torch optim
Videos of deep learning optimizers moving on 3D problem-landscapes
Stars: ✭ 86 (+330%)
Mutual labels:  optimizer
keras gradient noise
Add gradient noise to any Keras optimizer
Stars: ✭ 36 (+80%)
Mutual labels:  optimizer
Pytorch Optimizer
torch-optimizer -- collection of optimizers for Pytorch
Stars: ✭ 2,237 (+11085%)
Mutual labels:  optimizer
artificial-neural-variability-for-deep-learning
The PyTorch Implementation of Variable Optimizers/ Neural Variable Risk Minimization proposed in our Neural Computation paper: Artificial Neural Variability for Deep Learning: On overfitting, Noise Memorization, and Catastrophic Forgetting.
Stars: ✭ 34 (+70%)
Mutual labels:  optimizer
React Lite
An implementation of React v15.x that optimizes for small script size
Stars: ✭ 1,734 (+8570%)
Mutual labels:  optimizer
Image Optimize Command
Easily optimize images using WP CLI
Stars: ✭ 138 (+590%)
Mutual labels:  optimizer
neth-proxy
Stratum <-> Stratum Proxy and optimizer for ethminer
Stars: ✭ 35 (+75%)
Mutual labels:  optimizer
XTR-Toolbox
🛠 Versatile tool to optimize Windows
Stars: ✭ 138 (+590%)
Mutual labels:  optimizer
Adamw keras
AdamW optimizer for Keras
Stars: ✭ 106 (+430%)
Mutual labels:  optimizer
Nn dataflow
Explore the energy-efficient dataflow scheduling for neural networks.
Stars: ✭ 141 (+605%)
Mutual labels:  optimizer
Draftfast
A tool to automate and optimize DraftKings and FanDuel lineup construction.
Stars: ✭ 192 (+860%)
Mutual labels:  optimizer
LAMB Optimizer TF
LAMB Optimizer for Large Batch Training (TensorFlow version)
Stars: ✭ 119 (+495%)
Mutual labels:  optimizer
prediction gan
PyTorch Impl. of Prediction Optimizer (to stabilize GAN training)
Stars: ✭ 31 (+55%)
Mutual labels:  optimizer

Optimizers-for-Tensorflow

Adam, NAdam, and AAdam optimizers (see below for details about AAdam)

UPDATE July 2019: The accelerated solutions have been updated here, and the full paper explaining them is available here.

Requirements

  • TensorFlow (latest version)
  • Python 3 or higher
  • Your computer :-)

How to use

# training: minimize the softmax cross-entropy loss with AAdam
# (AAdamOptimizer is provided by this repository)
import tensorflow as tf

cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y_conv))
train_step = AAdamOptimizer(2e-3).minimize(cross_entropy)
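For context, the vanilla Adam update that NAdam and AAdam build on can be sketched in plain NumPy. This is an illustrative sketch, not this repository's TensorFlow implementation; the hyperparameter defaults are Adam's usual ones.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient and
    its square, with bias correction, then a scaled parameter step."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimize f(x) = x^2 (gradient 2x) starting from x = 5.0;
# x approaches the minimum at 0.
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 2001):
    x, m, v = adam_step(x, 2 * x, m, v, t)
```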

Results

UPDATE: I included 3 files to test the optimizers on MNIST data. I do a small grid search over the learning rate, so it can take some time to execute. I recommend using a cloud service if you do not have a GPU on your computer (mainly for the MLP and CNN models).
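The small learning-rate grid search mentioned above can be sketched as follows. Here `train_and_eval` is a hypothetical stand-in for training one model at a given learning rate and returning its validation accuracy; the repository's actual loop may differ.

```python
def grid_search(train_and_eval, lrs=(1e-1, 1e-2, 1e-3, 1e-4)):
    """Train once per candidate learning rate and keep the best score."""
    scores = {lr: train_and_eval(lr) for lr in lrs}
    best_lr = max(scores, key=scores.get)
    return best_lr, scores

# Example with a dummy evaluator whose score peaks at lr = 1e-2:
best_lr, scores = grid_search(lambda lr: 1.0 - abs(lr - 1e-2))
```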

I also included a small test file which implements a simple neural net and 8 optimizers, including AAdam (the old version) and its variant. To run the code, simply go to the command line and type python neuralnets-testing-optimizers.py. You don't need TensorFlow to run this file and quickly view the difference between the optimizers. However, for more interesting cases I recommend using the separate Python files.

The optimizers are tested on the make_moons toy data set from sklearn, available here.
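Loading that data set can be sketched as below. This is a minimal example: the `n_samples`, `noise`, and split values are illustrative choices, not necessarily the ones used in the test file.

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split

# Two interleaving half-circles; `noise` controls how much the classes overlap.
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
```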

Here are the results so far... (learning rate = 1e-2)

rmsprop => mean accuracy: 0.8698666666666667, std: 0.007847434117099816
sgd => mean accuracy: 0.8794666666666666, std: 0.001359738536958064
adam => mean accuracy: 0.872, std: 0.009074506414492586
aadam1 => mean accuracy: 0.8741333333333333, std: 0.006607739569794042 <-- without the sign of the gradient
nesterov => mean accuracy: 0.864, std: 0.021496046148071032
aadam2 => mean accuracy: 0.8784000000000001, std: 0.0011313708498984561 <-- with the sign of the gradient
adagrad => mean accuracy: 0.7981333333333334, std: 0.11408036153908735
momentum => mean accuracy: 0.7970666666666667, std: 0.025884529914388783

Enjoy!
