
shreyansh26 / ML-Optimizers-JAX

Licence: other
Toy implementations of some popular ML optimizers using Python/JAX

Programming Languages

Python
139,335 projects - #7 most used programming language
Jupyter Notebook
11,667 projects

Projects that are alternatives of or similar to ML-Optimizers-JAX

Hypergradient variants
Improved Hypergradient optimizers, providing better generalization and faster convergence.
Stars: ✭ 15 (-59.46%)
Mutual labels:  momentum, adam-optimizer, optimizers
a-tour-of-pytorch-optimizers
A tour of different optimization algorithms in PyTorch.
Stars: ✭ 46 (+24.32%)
Mutual labels:  optimization-algorithms, adam
Optim
OptimLib: a lightweight C++ library of numerical optimization methods for nonlinear functions
Stars: ✭ 411 (+1010.81%)
Mutual labels:  gradient-descent, optimization-algorithms
Mindseye
Neural Networks in Java 8 with CuDNN and Aparapi
Stars: ✭ 8 (-78.38%)
Mutual labels:  gradient-descent, optimization-algorithms
Radam
On the Variance of the Adaptive Learning Rate and Beyond
Stars: ✭ 2,442 (+6500%)
Mutual labels:  adam, adam-optimizer
Machine-Learning-in-Python-Workshop
My workshop on machine learning using python language to implement different algorithms
Stars: ✭ 89 (+140.54%)
Mutual labels:  gradient-descent, optimization-algorithms
Cppnumericalsolvers
a lightweight C++17 library of numerical optimization methods for nonlinear functions (Including L-BFGS-B for TensorFlow)
Stars: ✭ 638 (+1624.32%)
Mutual labels:  gradient-descent, optimization-algorithms
Nmflibrary
MATLAB library for non-negative matrix factorization (NMF): Version 1.8.1
Stars: ✭ 153 (+313.51%)
Mutual labels:  gradient-descent, optimization-algorithms
GDLibrary
Matlab library for gradient descent algorithms: Version 1.0.1
Stars: ✭ 50 (+35.14%)
Mutual labels:  gradient-descent, optimization-algorithms
sopt
sopt: a simple Python optimization library
Stars: ✭ 42 (+13.51%)
Mutual labels:  gradient-descent, optimization-algorithms
fmin adam
Matlab implementation of the Adam stochastic gradient descent optimisation algorithm
Stars: ✭ 38 (+2.7%)
Mutual labels:  gradient-descent, optimization-algorithms
Learning-Lab-C-Library
This library provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library is constantly updated.
Stars: ✭ 20 (-45.95%)
Mutual labels:  optimization-algorithms, adam-optimizer
madam
👩 Pytorch and Jax code for the Madam optimiser.
Stars: ✭ 46 (+24.32%)
Mutual labels:  jax
adaboost
An implementation of the paper "A Short Introduction to Boosting"
Stars: ✭ 20 (-45.95%)
Mutual labels:  optimization-algorithms
fedpa
Federated posterior averaging implemented in JAX
Stars: ✭ 38 (+2.7%)
Mutual labels:  jax
robustness-vit
Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+110.81%)
Mutual labels:  jax
haskell-vae
Learning about Haskell with Variational Autoencoders
Stars: ✭ 18 (-51.35%)
Mutual labels:  adam-optimizer
jaxdf
A JAX-based research framework for writing differentiable numerical simulators with arbitrary discretizations
Stars: ✭ 50 (+35.14%)
Mutual labels:  jax
lookahead tensorflow
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for tensorflow
Stars: ✭ 25 (-32.43%)
Mutual labels:  adam-optimizer
geneal
A genetic algorithm implementation in python
Stars: ✭ 47 (+27.03%)
Mutual labels:  optimization-algorithms

ML Optimizers from scratch using JAX

Implementations of some popular optimizers from scratch for a simple model, namely linear regression on a dataset with 5 features. The goal of this project was to understand how these optimizers work under the hood and to write toy implementations myself. I also use a bit of JAX magic to differentiate the loss function w.r.t. the weights and the bias without explicitly writing their derivatives as separate functions. This helps generalize the notebook to other loss functions as well.
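The "JAX magic" referred to above is `jax.grad`, which derives the gradient function automatically. As a minimal sketch (the loss and variable names here are illustrative, not necessarily the repository's), differentiating an MSE loss for linear regression w.r.t. both the weights and the bias looks like this:

```python
import jax
import jax.numpy as jnp

def mse_loss(params, X, y):
    """Mean squared error for a linear model: preds = X @ w + b."""
    w, b = params
    preds = X @ w + b
    return jnp.mean((preds - y) ** 2)

# jax.grad differentiates the loss w.r.t. its first argument (the params
# pytree), so no hand-written derivative functions are needed.
grad_fn = jax.grad(mse_loss)

key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (100, 5))   # 5 features, matching the project
true_w = jnp.arange(1.0, 6.0)
y = X @ true_w + 2.0

params = (jnp.zeros(5), 0.0)
grads = grad_fn(params, X, y)          # (dL/dw, dL/db), same shapes as params
```

Because `grad_fn` is derived from the loss, swapping in a different loss function requires no other changes.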

The notebook is available on Kaggle and can be opened in Colab.

The optimizers I have implemented are -

  • Batch Gradient Descent
  • Batch Gradient Descent + Momentum
  • Nesterov Accelerated Momentum
  • Adagrad
  • RMSprop
  • Adam
  • Adamax
  • Nadam
  • Adabelief
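To give a flavour of what such from-scratch implementations look like, here is a minimal sketch of a single Adam update step in JAX. The function and variable names are illustrative, not the repository's own:

```python
import jax.numpy as jnp

def adam_step(param, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: exponential moving averages of the gradient and its
    square, with bias correction for their zero initialization."""
    m = b1 * m + (1 - b1) * grad            # first moment (mean of gradients)
    v = b2 * v + (1 - b2) * grad ** 2       # second moment (uncentered variance)
    m_hat = m / (1 - b1 ** t)               # bias-corrected moments
    v_hat = v / (1 - b2 ** t)
    param = param - lr * m_hat / (jnp.sqrt(v_hat) + eps)
    return param, m, v

# Example: one step (t=1) on a toy parameter/gradient pair
w = jnp.array([1.0, -2.0])
g = jnp.array([0.5, -0.5])
w, m, v = adam_step(w, g, m=jnp.zeros(2), v=jnp.zeros(2), t=1)
```

On the first step the bias correction makes `m_hat` equal the raw gradient and `v_hat` its square, so each parameter moves by roughly `lr` in the direction opposite its gradient's sign.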

