
eBay / AutoOpt

License: Apache-2.0
Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient Descent

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to AutoOpt

Hypergradient variants
Improved Hypergradient optimizers, providing better generalization and faster convergence.
Stars: ✭ 15 (-65.91%)
Mutual labels:  momentum, learning-rate
randopt
Streamlined machine learning experiment management.
Stars: ✭ 108 (+145.45%)
Mutual labels:  hyperparameters
postcss-momentum-scrolling
PostCSS plugin that adds 'momentum'-style scrolling behavior (-webkit-overflow-scrolling: touch) for elements with overflow (scroll, auto) on iOS
Stars: ✭ 69 (+56.82%)
Mutual labels:  momentum
TransE
A Python implementation of the TransE method, explaining TransE's vector updates under SGD
Stars: ✭ 31 (-29.55%)
Mutual labels:  sgd
theedhum-nandrum
A sentiment classifier on mixed language (and mixed script) reviews in Tamil, Malayalam and English
Stars: ✭ 16 (-63.64%)
Mutual labels:  sgd
DiFacto2 ffm
Distributed Field-aware Factorization Machines based on a Parameter Server
Stars: ✭ 11 (-75%)
Mutual labels:  sgd
ML-Optimizers-JAX
Toy implementations of some popular ML optimizers using Python/JAX
Stars: ✭ 37 (-15.91%)
Mutual labels:  momentum
MLBlocks
A library for composing end-to-end tunable machine learning pipelines.
Stars: ✭ 94 (+113.64%)
Mutual labels:  hyperparameters
forecastVeg
A Machine Learning Approach to Forecasting Remotely Sensed Vegetation Health in Python
Stars: ✭ 44 (+0%)
Mutual labels:  hyperparameters
batchnorm-pruning
Rethinking the Smaller-Norm-Less-Informative Assumption in Channel Pruning of Convolution Layers https://arxiv.org/abs/1802.00124
Stars: ✭ 66 (+50%)
Mutual labels:  sgd
SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
Stars: ✭ 165 (+275%)
Mutual labels:  sgd
LinkOS-Android-Samples
Java-based sample code for developing on Android. The demos in this repository are stored on separate branches. To navigate to a demo, please click branches.
Stars: ✭ 52 (+18.18%)
Mutual labels:  sgd
Awd Lstm Lm
LSTM and QRNN Language Model Toolkit for PyTorch
Stars: ✭ 1,834 (+4068.18%)
Mutual labels:  sgd
Ta
Technical Analysis Library using Pandas and Numpy
Stars: ✭ 2,649 (+5920.45%)
Mutual labels:  momentum
allennlp-optuna
⚡️ AllenNLP plugin for adding subcommands to use Optuna, making hyperparameter optimization easy
Stars: ✭ 33 (-25%)
Mutual labels:  hyperparameters
HAR
Recognize one of six human activities such as standing, sitting, and walking using a Softmax Classifier trained on mobile phone sensor data.
Stars: ✭ 18 (-59.09%)
Mutual labels:  momentum
numpy-neuralnet-exercise
Implementation of key neural network concepts using NumPy
Stars: ✭ 49 (+11.36%)
Mutual labels:  sgd
a-tour-of-pytorch-optimizers
A tour of different optimization algorithms in PyTorch.
Stars: ✭ 46 (+4.55%)
Mutual labels:  sgd
pytorch-lr-scheduler
PyTorch implementation of some learning rate schedulers for deep learning researcher.
Stars: ✭ 65 (+47.73%)
Mutual labels:  learning-rate
paradox
ParamHelpers Next Generation
Stars: ✭ 23 (-47.73%)
Mutual labels:  hyperparameters

AutoOpt

This package implements various optimizers that automatically and simultaneously adjust the learning rate and the momentum. The AutoOpt package can be used in deep learning training in place of the regular optimizers available in the PyTorch framework. The mini-batch flow during training is shown in the figure below.
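As a rough sketch of that flow, the snippet below drops an AutoOpt optimizer into an ordinary PyTorch mini-batch loop. The import path autoopt.optim and the class name AutoSGD are assumptions made for illustration only; check the package source and the examples folder for the actual names.

import torch
import torch.nn as nn
from autoopt.optim import AutoSGD  # assumed module and class name, for illustration only

model = nn.Linear(10, 1)
criterion = nn.MSELoss()

# With torch.optim.SGD you would fix the hyperparameters by hand, e.g.
#   optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
# With AutoOpt the learning rate and momentum are adjusted automatically on
# every mini-batch, so they are not passed in here.
optimizer = AutoSGD(model.parameters())

for step in range(100):
    inputs = torch.randn(32, 10)   # a toy mini-batch
    targets = torch.randn(32, 1)
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()               # learning rate and momentum updated internally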

Installation

This package is built and tested with Python 3.6. Create a virtual environment and install the dependencies as follows:

python3 -m venv .env
source .env/bin/activate
pip install --upgrade pip
pip install torch torchvision

Now install the AutoOpt package from its source repository:

pip install [autoopt-path]

Examples

Please see the sample code in the examples folder to understand how this package can be used to train various ML models.
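For a quick feel before opening the examples folder, the short comparison below trains the same toy model once with hand-tuned SGD and once with the hypothetical AutoSGD class from the sketch above. It is illustrative only and not taken from the examples themselves.

import copy
import torch
import torch.nn as nn
from autoopt.optim import AutoSGD  # assumed name, as in the sketch above

def train(model, optimizer, steps=200):
    criterion = nn.MSELoss()
    torch.manual_seed(0)                 # same synthetic batches for both runs
    for _ in range(steps):
        x = torch.randn(64, 10)
        y = x.sum(dim=1, keepdim=True)   # simple synthetic regression target
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    return loss.item()

base = nn.Linear(10, 1)
manual = copy.deepcopy(base)
auto = copy.deepcopy(base)

print("SGD, hand-tuned lr/momentum :",
      train(manual, torch.optim.SGD(manual.parameters(), lr=0.01, momentum=0.9)))
print("AutoSGD, adjusted on the fly:",
      train(auto, AutoSGD(auto.parameters())))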

Citing the AutoOpt paper

Please cite the AutoOpt paper if you use this package in a scientific publication.

@inproceedings{9053316,
  author={T. {Lancewicki} and S. {Kopru}},
  booktitle={ICASSP 2020 - 2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)}, 
  title={Automatic and Simultaneous Adjustment of Learning Rate and Momentum for Stochastic Gradient-based Optimization Methods}, 
  year={2020},
  volume={},
  number={},
  pages={3127-3131}
}

License

Copyright 2019 eBay Inc.

Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at

https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

Third Party Code Attribution

This software contains code licensed by third parties. See LICENSE.txt.
