
mmahesh / variants-of-rmsprop-and-adagrad

Licence: other
SC-Adagrad, SC-RMSProp, and RMSProp algorithms for training deep networks, as proposed in the paper referenced below.

Programming Languages

python
139335 projects - #7 most used programming language

Projects that are alternatives to or similar to variants-of-rmsprop-and-adagrad

a-tour-of-pytorch-optimizers
A tour of different optimization algorithms in PyTorch.
Stars: ✭ 46 (+228.57%)
Mutual labels:  adagrad, rmsprop
OLSTEC
OnLine Low-rank Subspace tracking by TEnsor CP Decomposition in Matlab: Version 1.0.1
Stars: ✭ 30 (+114.29%)
Mutual labels:  online-learning, stochastic-gradient-descent
SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
Stars: ✭ 165 (+1078.57%)
Mutual labels:  online-learning, stochastic-gradient-descent
Data Driven Web Apps With Flask
Course demo code and other hand-out materials for our data-driven web apps in Flask course
Stars: ✭ 388 (+2671.43%)
Mutual labels:  online-learning
Angel
A Flexible and Powerful Parameter Server for large-scale machine learning
Stars: ✭ 6,458 (+46028.57%)
Mutual labels:  online-learning
Online learning
Online Learning for Human Detection in 3D Point Clouds
Stars: ✭ 97 (+592.86%)
Mutual labels:  online-learning
Adas
Exploiting Explainable Metrics for Augmented SGD [CVPR2022]
Stars: ✭ 42 (+200%)
Mutual labels:  stochastic-gradient-descent
Train plus plus
Repo and code of the IEEE UIC paper: Train++: An Incremental ML Model Training Algorithm to Create Self-Learning IoT Devices
Stars: ✭ 17 (+21.43%)
Mutual labels:  online-learning
Nmflibrary
MATLAB library for non-negative matrix factorization (NMF): Version 1.8.1
Stars: ✭ 153 (+992.86%)
Mutual labels:  online-learning
Online Recurrent Extreme Learning Machine
Online-Recurrent-Extreme-Learning-Machine (OR-ELM) for time-series prediction, implemented in python
Stars: ✭ 95 (+578.57%)
Mutual labels:  online-learning
Roadmap
GitBook: OSCP RoadMap
Stars: ✭ 89 (+535.71%)
Mutual labels:  online-learning
Vowpal wabbit
Vowpal Wabbit is a machine learning system which pushes the frontier of machine learning with techniques such as online, hashing, allreduce, reductions, learning2search, active, and interactive learning.
Stars: ✭ 7,815 (+55721.43%)
Mutual labels:  online-learning
River
🌊 Online machine learning in Python
Stars: ✭ 2,980 (+21185.71%)
Mutual labels:  online-learning
Onlinemooc
Online education website and admin backend built with a Vue front end, Django 3.1, Django REST Framework, and Ant Design Pro V4.
Stars: ✭ 587 (+4092.86%)
Mutual labels:  online-learning
Lycoris
A lightweight and easy-to-use deep learning framework with neural architecture search.
Stars: ✭ 180 (+1185.71%)
Mutual labels:  online-learning
Boost Cookbook
Online examples from "Boost C++ Application Development Cookbook":
Stars: ✭ 306 (+2085.71%)
Mutual labels:  online-learning
Continuum
A clean and simple data loading library for Continual Learning
Stars: ✭ 136 (+871.43%)
Mutual labels:  online-learning
Siddhi
Stream Processing and Complex Event Processing Engine
Stars: ✭ 1,185 (+8364.29%)
Mutual labels:  online-learning
Hypergan
Composable GAN framework with api and user interface
Stars: ✭ 1,104 (+7785.71%)
Mutual labels:  online-learning
Fwumious wabbit
Fwumious Wabbit, fast on-line machine learning toolkit written in Rust
Stars: ✭ 96 (+585.71%)
Mutual labels:  online-learning

Variants of RMSProp and Adagrad

Keras implementation of the SC-Adagrad, SC-RMSProp, and RMSProp algorithms proposed here.

A short version of the paper, accepted at ICML 2017, can be found here.

I wrote a blog post/tutorial here describing Adagrad, RMSProp, Adam, SC-Adagrad, and SC-RMSProp in simple terms, making it easy to grasp the gist of each algorithm.

Usage

Suppose you have created a deep network using Keras and now want to train it with the algorithms above. Copy the file "new_optimizers.py" into your repository. Then, in the file where the model is created (and compiled), add the following:

from new_optimizers import *

# Suppose, for example, you want to use SC-Adagrad:
# create the optimizer object as follows.

sc_adagrad = SC_Adagrad()

# Similarly for SC-RMSProp and RMSProp (ours):

sc_rmsprop = SC_RMSProp()
rmsprop_variant = RMSProp_variant()

Then, in the code where you compile your Keras model, set optimizer=sc_adagrad. You can do the same for the SC-RMSProp and RMSProp optimizers.
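For instance, a minimal compile-and-fit call might look like the sketch below. The tiny model architecture, loss, metrics, and input shape here are placeholder assumptions for illustration, not part of this repository:

from keras.models import Sequential
from keras.layers import Dense
from new_optimizers import SC_Adagrad

# A tiny illustrative classifier; your own architecture goes here.
model = Sequential([
    Dense(64, activation='relu', input_shape=(100,)),
    Dense(10, activation='softmax'),
])

# Pass the optimizer object from new_optimizers.py to compile().
model.compile(optimizer=SC_Adagrad(),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# model.fit(x_train, y_train, epochs=10)  # train with your own data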

Overview of Algorithms




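As a rough sketch of what distinguishes the SC ("strongly convex") variants from their standard counterparts, here is a minimal NumPy rendition of a single Adagrad versus SC-Adagrad parameter update. The exponential form of the damping term delta and the hyperparameter names xi1/xi2 below are assumptions for illustration; consult the paper for the exact schedules and defaults.

import numpy as np

def adagrad_step(theta, g, v, alpha=0.01, eps=1e-8):
    # Standard Adagrad: accumulate squared gradients, divide by the
    # square root of the accumulator.
    v = v + g * g
    theta = theta - alpha * g / (np.sqrt(v) + eps)
    return theta, v

def sc_adagrad_step(theta, g, v, alpha=0.01, xi1=0.1, xi2=1.0):
    # SC-Adagrad (sketch): same accumulator, but no square root in the
    # denominator, plus an elementwise decaying damping term delta.
    # The exact form of delta here is an assumption; see the paper.
    v = v + g * g
    delta = xi2 * np.exp(-xi1 * v)
    theta = theta - alpha * g / (v + delta)
    return theta, v

The key structural difference is the denominator: Adagrad divides by the square root of the accumulated squared gradients, while SC-Adagrad divides by the accumulator itself plus a decaying damping term, which is what gives the SC variants their improved regret guarantees in the strongly convex setting.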