Hypergradient variants: Improved hypergradient optimizers aimed at better generalization and faster convergence; the core learning-rate adaptation is sketched below.
Stars: ✭ 15 (-59.46%)
Mutual labels: momentum, adam-optimizer, optimizers
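Hypergradient methods treat the learning rate itself as a parameter and update it by gradient descent. Here is a minimal NumPy sketch of hypergradient SGD in the style of Baydin et al., "Online Learning Rate Adaptation with Hypergradient Descent"; the function names and parameters are illustrative, not this repository's API:

```python
import numpy as np

def hypergradient_sgd(grad, theta, alpha=0.01, beta=1e-4, steps=200):
    """SGD whose learning rate alpha is itself adapted online.

    grad  : function returning the gradient of the objective at theta
    beta  : learning rate for the learning rate (the hypergradient step)
    """
    g_prev = np.zeros_like(theta)
    for _ in range(steps):
        g = grad(theta)
        # d(loss)/d(alpha) = -g . g_prev, so descending on it grows alpha
        # while consecutive gradients stay aligned and shrinks it otherwise.
        alpha += beta * np.dot(g, g_prev)
        theta = theta - alpha * g
        g_prev = g
    return theta, alpha

# Example: minimize ||x||^2, whose gradient is 2x.
theta, alpha = hypergradient_sgd(lambda x: 2.0 * x, np.ones(5))
```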
a-tour-of-pytorch-optimizers: A tour of different optimization algorithms in PyTorch.
Stars: ✭ 46 (+24.32%)
Mutual labels: optimization-algorithms, adam
Optim: OptimLib, a lightweight C++ library of numerical optimization methods for nonlinear functions; the basic gradient-descent iteration such libraries build on is sketched below.
Stars: ✭ 411 (+1010.81%)
Mutual labels: gradient-descent, optimization-algorithms
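The gradient-descent label here refers to the basic iteration x ← x − α∇f(x). A self-contained Python sketch for reference (illustrative only, not OptimLib's C++ interface):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=1000):
    """Minimize a smooth function given only its gradient."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stop once the gradient vanishes
            break
        x = x - lr * g               # step against the gradient
    return x

# Example: f(x) = (x - 3)^2 has gradient 2(x - 3); the minimum is at 3.
print(gradient_descent(lambda x: 2 * (x - 3), x0=[0.0]))
```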
Mindseye: Neural Networks in Java 8 with CuDNN and Aparapi.
Stars: ✭ 8 (-78.38%)
Mutual labels: gradient-descent, optimization-algorithms
Radam: On the Variance of the Adaptive Learning Rate and Beyond.
Stars: ✭ 2,442 (+6500%)
Mutual labels: adam, adam-optimizer
Machine-Learning-in-Python-Workshop: My workshop on machine learning, using Python to implement different algorithms.
Stars: ✭ 89 (+140.54%)
Mutual labels: gradient-descent, optimization-algorithms
Cppnumericalsolvers: A lightweight C++17 library of numerical optimization methods for nonlinear functions (including L-BFGS-B for TensorFlow).
Stars: ✭ 638 (+1624.32%)
Mutual labels: gradient-descent, optimization-algorithms
Nmflibrary: MATLAB library for non-negative matrix factorization (NMF), version 1.8.1; the classic multiplicative updates are sketched below.
Stars: ✭ 153 (+313.51%)
Mutual labels: gradient-descent, optimization-algorithms
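For orientation, the best-known NMF solver is the Lee-Seung multiplicative update, which the gradient-based variants collected in such a library generalize. A hedged NumPy sketch minimizing the Frobenius error ||V − WH|| (not NMFLibrary's MATLAB interface):

```python
import numpy as np

def nmf_multiplicative(V, rank, iters=200, eps=1e-10, seed=0):
    """Factor V ≈ W @ H with nonnegative factors (Lee-Seung updates)."""
    rng = np.random.default_rng(seed)
    W = rng.random((V.shape[0], rank))
    H = rng.random((rank, V.shape[1]))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)  # multiplicative step keeps H >= 0
        W *= (V @ H.T) / (W @ H @ H.T + eps)  # likewise for W
    return W, H

V = np.abs(np.random.default_rng(1).random((20, 30)))
W, H = nmf_multiplicative(V, rank=5)
```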
GDLibrary: MATLAB library for gradient descent algorithms, version 1.0.1.
Stars: ✭ 50 (+35.14%)
Mutual labels: gradient-descent, optimization-algorithms
sopt: A simple Python optimization library.
Stars: ✭ 42 (+13.51%)
Mutual labels: gradient-descent, optimization-algorithms
fmin adam: MATLAB implementation of the Adam stochastic gradient descent optimisation algorithm; the update rule is sketched below.
Stars: ✭ 38 (+2.7%)
Mutual labels: gradient-descent, optimization-algorithms
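The algorithm behind this entry keeps exponentially decayed estimates of the gradient's first and second moments and corrects their initialization bias. A minimal NumPy sketch of the standard Adam update (Kingma and Ba), not fmin_adam's MATLAB signature:

```python
import numpy as np

def adam(grad, theta, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, steps=1000):
    """Adam: adaptive moment estimation."""
    m = np.zeros_like(theta)  # first moment (mean of gradients)
    v = np.zeros_like(theta)  # second moment (uncentered variance)
    for t in range(1, steps + 1):
        g = grad(theta)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)  # bias-corrected moment estimates
        v_hat = v / (1 - beta2**t)
        theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta

theta = adam(lambda x: 2 * x, np.full(3, 10.0))  # drives ||x||^2 toward 0
```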
Learning-Lab-C-Library: Provides a set of basic functions for different types of deep learning (and other) algorithms in C. The library is constantly updated.
Stars: ✭ 20 (-45.95%)
Mutual labels: optimization-algorithms, adam-optimizer
madam: 👩 PyTorch and JAX code for the Madam optimiser.
Stars: ✭ 46 (+24.32%)
Mutual labels: jax
adaboost: An implementation of the paper "A Short Introduction to Boosting".
Stars: ✭ 20 (-45.95%)
Mutual labels: optimization-algorithms
fedpa: Federated posterior averaging implemented in JAX.
Stars: ✭ 38 (+2.7%)
Mutual labels: jax
robustness-vit: Contains code for the paper "Vision Transformers are Robust Learners" (AAAI 2022).
Stars: ✭ 78 (+110.81%)
Mutual labels: jax
haskell-vae: Learning about Haskell with Variational Autoencoders.
Stars: ✭ 18 (-51.35%)
Mutual labels: adam-optimizer
jaxdf: A JAX-based research framework for writing differentiable numerical simulators with arbitrary discretizations.
Stars: ✭ 50 (+35.14%)
Mutual labels: jax
lookahead tensorflow: Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for TensorFlow; the slow/fast-weight scheme is sketched below.
Stars: ✭ 25 (-32.43%)
Mutual labels: adam-optimizer
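"k steps forward, 1 step back" means the inner optimizer takes k fast-weight steps, after which the slow weights move a fraction alpha toward the result. A framework-agnostic NumPy sketch with plain SGD as the inner optimizer (illustrative, not this repository's TensorFlow wrapper):

```python
import numpy as np

def lookahead(grad, theta0, k=5, alpha=0.5, lr=0.1, outer_steps=100):
    """Lookahead: k fast steps forward, then one interpolating step back."""
    slow = np.asarray(theta0, dtype=float)
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(k):             # k steps forward with the inner SGD
            fast -= lr * grad(fast)
        slow += alpha * (fast - slow)  # 1 step back toward the fast weights
    return slow

print(lookahead(lambda x: 2 * x, np.ones(4)))  # converges toward zero
```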
geneal: A genetic algorithm implementation in Python; a minimal GA loop is sketched below.
Stars: ✭ 47 (+27.03%)
Mutual labels: optimization-algorithms
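As a reference point for the entry above, a real-valued genetic algorithm needs only selection, crossover, and mutation. A hedged Python sketch of a generic GA loop (illustrative, not geneal's API):

```python
import numpy as np

def ga_minimize(fitness, dim, pop_size=50, gens=100, mut_rate=0.1, seed=0):
    """Minimal GA: tournament selection, blend crossover, Gaussian mutation."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))
    for _ in range(gens):
        scores = np.apply_along_axis(fitness, 1, pop)
        children = []
        for _ in range(pop_size):
            # Tournament selection: keep the fitter of two random picks, twice.
            i = rng.integers(pop_size, size=2)
            j = rng.integers(pop_size, size=2)
            p1 = pop[i[np.argmin(scores[i])]]
            p2 = pop[j[np.argmin(scores[j])]]
            w = rng.random(dim)                 # blend crossover
            child = w * p1 + (1 - w) * p2
            mask = rng.random(dim) < mut_rate   # Gaussian mutation
            child[mask] += rng.normal(0.0, 0.5, mask.sum())
            children.append(child)
        pop = np.array(children)
    scores = np.apply_along_axis(fitness, 1, pop)
    return pop[np.argmin(scores)]

best = ga_minimize(lambda x: float(np.sum(x**2)), dim=3)  # near the origin
```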