
bentrevett / a-tour-of-pytorch-optimizers

License: MIT License
A tour of different optimization algorithms in PyTorch.

Programming Languages

Jupyter Notebook

Projects that are alternatives of or similar to a-tour-of-pytorch-optimizers

Pygmo2
A Python platform to perform parallel computations of optimisation tasks (global and local) via the asynchronous generalized island model.
Stars: ✭ 134 (+191.3%)
Mutual labels:  optimization, optimization-algorithms
Fewshotlearning
Pytorch implementation of the paper "Optimization as a Model for Few-Shot Learning"
Stars: ✭ 223 (+384.78%)
Mutual labels:  optimization, optimization-algorithms
Sporco
Sparse Optimisation Research Code
Stars: ✭ 164 (+256.52%)
Mutual labels:  optimization, optimization-algorithms
Cppnumericalsolvers
A lightweight C++17 library of numerical optimization methods for nonlinear functions (including L-BFGS-B for TensorFlow)
Stars: ✭ 638 (+1286.96%)
Mutual labels:  optimization, optimization-algorithms
SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
Stars: ✭ 165 (+258.7%)
Mutual labels:  sgd, optimization-algorithms
Pyswarms
A research toolkit for particle swarm optimization in Python
Stars: ✭ 742 (+1513.04%)
Mutual labels:  optimization, optimization-algorithms
nuxt-prune-html
🔌⚡ Nuxt module to prune HTML before sending it to the browser (it removes elements matching given CSS selectors). Useful for boosting performance by serving different HTML to bots/audits, removing all the scripts used for dynamic rendering
Stars: ✭ 69 (+50%)
Mutual labels:  optimization, optimization-algorithms
Ensmallen
A header-only C++ library for numerical optimization
Stars: ✭ 436 (+847.83%)
Mutual labels:  optimization, optimization-algorithms
FactorizationMachine
Implementation of factorization machines, with support for classification.
Stars: ✭ 19 (-58.7%)
Mutual labels:  sgd, adagrad
variants-of-rmsprop-and-adagrad
SC-Adagrad, SC-RMSProp and RMSProp algorithms for training deep networks proposed in
Stars: ✭ 14 (-69.57%)
Mutual labels:  adagrad, rmsprop
Solid
🎯 A comprehensive gradient-free optimization framework written in Python
Stars: ✭ 546 (+1086.96%)
Mutual labels:  optimization, optimization-algorithms
zoofs
zoofs is a Python library for performing feature selection using a variety of nature-inspired wrapper algorithms, ranging from swarm-intelligence to physics-based to evolutionary. It's an easy-to-use, flexible and powerful tool for reducing your feature set.
Stars: ✭ 142 (+208.7%)
Mutual labels:  optimization, optimization-algorithms
Pagmo2
A C++ platform to perform parallel computations of optimisation tasks (global and local) via the asynchronous generalized island model.
Stars: ✭ 540 (+1073.91%)
Mutual labels:  optimization, optimization-algorithms
Optimviz
Visualize optimization algorithms in MATLAB.
Stars: ✭ 106 (+130.43%)
Mutual labels:  optimization, optimization-algorithms
Awesome Robotics
A curated list of awesome links and software libraries that are useful for robots.
Stars: ✭ 478 (+939.13%)
Mutual labels:  optimization, optimization-algorithms
Python Mip
Collection of Python tools for the modeling and solution of Mixed-Integer Linear programs
Stars: ✭ 202 (+339.13%)
Mutual labels:  optimization, optimization-algorithms
Ojalgo
oj! Algorithms
Stars: ✭ 336 (+630.43%)
Mutual labels:  optimization, optimization-algorithms
Optim
OptimLib: a lightweight C++ library of numerical optimization methods for nonlinear functions
Stars: ✭ 411 (+793.48%)
Mutual labels:  optimization, optimization-algorithms
Argmin
Mathematical optimization in pure Rust
Stars: ✭ 234 (+408.7%)
Mutual labels:  optimization, optimization-algorithms
geneal
A genetic algorithm implementation in python
Stars: ✭ 47 (+2.17%)
Mutual labels:  optimization, optimization-algorithms

A Tour of PyTorch Optimizers

In this tutorial repo we'll be walking through different gradient descent optimization algorithms by describing how they work and then implementing them in PyTorch (using version 1.10).

This tutorial is aimed at people who understand the main concept of gradient descent - repeatedly taking steps against the direction of the gradient of a loss function calculated with respect to a set of parameters - but are unsure how the common optimization algorithms work. Knowledge of how backpropagation works is also not needed; all we need to know is that it is a method of calculating the gradients used for gradient descent. If you need to brush up on either of these concepts, I recommend Andrew Ng's Machine Learning course.
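As a concrete (if toy) illustration of that core idea, here is a minimal gradient-descent loop in plain Python; the function, learning rate, and step count are all arbitrary choices for the sketch:

```python
# Minimise f(x) = (x - 3)^2 by repeatedly stepping against its gradient.
# The analytic gradient is f'(x) = 2 * (x - 3).

def grad_f(x):
    return 2.0 * (x - 3.0)

x = 0.0    # initial parameter value (arbitrary)
lr = 0.1   # learning rate, i.e. the step size

for _ in range(100):
    x = x - lr * grad_f(x)  # theta <- theta - lr * grad(theta)

print(round(x, 4))  # approaches the minimum at x = 3
```

Every optimizer covered below is a variation on this single update line, differing only in how the step is scaled and accumulated.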

We'll cover the following optimization algorithms:

  • SGD
  • SGD with momentum
  • Adagrad
  • Adadelta
  • RMSprop
  • Adam

More may be added in the future!
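As a rough preview of the algorithms listed above, their per-parameter update rules can be sketched in plain Python. This is a hand-rolled, single-parameter sketch with illustrative hyperparameter values - not the notebook's PyTorch implementations (in PyTorch these live in `torch.optim` as `SGD`, `Adagrad`, `Adadelta`, `RMSprop`, and `Adam`):

```python
import math

# Single-parameter update rules; all hyperparameter values are illustrative.

def sgd_step(x, g, lr=0.1):
    return x - lr * g

def momentum_step(x, g, v, lr=0.1, beta=0.9):
    v = beta * v + g                       # accumulate a running "velocity"
    return x - lr * v, v

def adagrad_step(x, g, s, lr=0.5, eps=1e-8):
    s = s + g * g                          # accumulate squared gradients
    return x - lr * g / (math.sqrt(s) + eps), s

def rmsprop_step(x, g, s, lr=0.1, alpha=0.9, eps=1e-8):
    s = alpha * s + (1 - alpha) * g * g    # exponential moving average of g^2
    return x - lr * g / (math.sqrt(s) + eps), s

def adadelta_step(x, g, s_g, s_dx, rho=0.9, eps=1e-6):
    s_g = rho * s_g + (1 - rho) * g * g    # EMA of squared gradients
    dx = -math.sqrt(s_dx + eps) / math.sqrt(s_g + eps) * g
    s_dx = rho * s_dx + (1 - rho) * dx * dx  # EMA of squared updates
    return x + dx, s_g, s_dx

def adam_step(x, g, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g              # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * g * g          # second-moment (variance) estimate
    m_hat = m / (1 - b1 ** t)              # bias correction for the warm-up
    v_hat = v / (1 - b2 ** t)
    return x - lr * m_hat / (math.sqrt(v_hat) + eps), m, v

# Quick sanity check: minimise f(x) = (x - 3)^2 with momentum.
x, v = 0.0, 0.0
for _ in range(500):
    x, v = momentum_step(x, 2.0 * (x - 3.0), v)
print(abs(x - 3.0) < 1e-6)  # momentum converges on this quadratic
```

The notebook derives each of these rules properly; the sketch only shows how little they differ structurally - each one rescales the same raw gradient using some running statistic of past gradients.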


The notebook is best rendered in Jupyter's NBViewer via this link as GitHub does a pretty poor job of rendering equations in notebooks.

If you find any mistakes or have any feedback, please submit an issue and I'll try to respond ASAP.


Resources

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].