
RMSGD: Augmented SGD Optimizer

Official PyTorch implementation of the RMSGD optimizer from:

Exploiting Explainable Metrics for Augmented SGD
Mahdi S. Hosseini, Mathieu Tuli, Konstantinos N. Plataniotis
Accepted in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR2022)


We propose new explainability metrics that measure the redundant information in a network's layers and exploit this information to augment the Stochastic Gradient Descent (SGD) optimizer by adaptively adjusting the learning rate in each layer. We call this new optimizer RMSGD. RMSGD is fast, outperforms existing state-of-the-art optimizers, and generalizes well across experimental configurations.
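As a rough, hypothetical sketch of the general pattern only (a per-layer scalar metric driving a per-layer learning rate; the function name, the metric, and the clipping bounds below are invented for illustration and are not the paper's actual formulation):

```python
# Toy illustration of layer-wise learning-rate adaptation.
# NOTE: this is NOT RMSGD's metric; it only shows the general idea of
# turning a per-layer scalar measurement into a per-layer learning rate.

def adapt_layer_lr(base_lr, prev_metric, curr_metric, floor=0.1, ceil=2.0):
    """Scale one layer's learning rate by how much its (hypothetical)
    quality metric changed since the last epoch, clipped to a safe range."""
    if prev_metric == 0.0:
        return base_lr  # no history yet: keep the base rate
    ratio = curr_metric / prev_metric
    scale = min(max(ratio, floor), ceil)  # clip to [floor, ceil]
    return base_lr * scale

# One adapted rate per layer, from (previous, current) metric pairs:
lrs = [adapt_layer_lr(0.1, p, c) for p, c in [(1.0, 1.5), (1.0, 0.05), (0.0, 0.3)]]
```

Under this toy rule, an improving layer (first pair) keeps a larger step while a stalled layer (second pair) is clipped down to the floor; RMSGD's actual per-layer rule is derived from its explainability metrics as described in the paper.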

Contents

This branch of the repository contains the standalone optimizer, which is pip-installable. Alternatively, you can copy the contents of src/rmsgd into your local repository and use the optimizer as is.

For all code relating to our paper and instructions to replicate those experiments, see the paper branch.

Installation

You can install rmsgd with pip install rmsgd, or equivalently:

git clone https://github.com/mahdihosseini/RMSGD.git
cd RMSGD
pip install .

Usage

RMSGD can be used like any other optimizer, with one additional step:

from rmsgd import RMSGD
...
optimizer = RMSGD(...)
...
for input, target in data_loader:
    optimizer.zero_grad()
    output = network(input)
    loss = criterion(output, target)
    loss.backward()
    optimizer.step()
optimizer.epoch_step()

The one additional step is calling .epoch_step() at the end of each epoch, so that RMSGD can update its analysis of the network's layers.

Citation

@InProceedings{hosseini2022rmsgd,
  author    = {Hosseini, Mahdi S. and Tuli, Mathieu and Plataniotis, Konstantinos N.},
  title     = {Exploiting Explainable Metrics for Augmented SGD},
  booktitle = {IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year      = {2022},
}

License

This project is released under the MIT license. Please see the LICENSE file for more information.
