
tsoernes / gfsopt

License: MIT License
Convenient hyperparameter optimization

Projects that are alternatives to or similar to gfsopt

falcon
A WordPress cleanup and performance optimization plugin.
Stars: ✭ 17 (+41.67%)
Mutual labels:  optimization, optimizer
Windows11-Optimization
Community repository to improve the security and performance of Windows 10 and Windows 11 with tweaks, commands, scripts, registry keys, configuration, tutorials and more
Stars: ✭ 17 (+41.67%)
Mutual labels:  optimization, optimizer
pigosat
Go (golang) bindings for Picosat, the satisfiability solver
Stars: ✭ 15 (+25%)
Mutual labels:  optimization, optimizer
goga
Go evolutionary algorithm: a library for developing evolutionary and genetic algorithms to solve optimisation problems with or without many constraints and many objectives. It also aims to handle mixed-type representations (reals and integers).
Stars: ✭ 39 (+225%)
Mutual labels:  optimization, optimizer
rcppensmallen
Rcpp integration for the Ensmallen templated C++ mathematical optimization library
Stars: ✭ 28 (+133.33%)
Mutual labels:  optimization
rethinking-bnn-optimization
Implementation for the paper "Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization"
Stars: ✭ 62 (+416.67%)
Mutual labels:  optimizer
pyPESTO
python Parameter EStimation TOolbox
Stars: ✭ 93 (+675%)
Mutual labels:  optimization
siconos
Simulation framework for nonsmooth dynamical systems
Stars: ✭ 120 (+900%)
Mutual labels:  optimization
optimo
Keyframe-based motion editing system using numerical optimization [CHI 2018]
Stars: ✭ 22 (+83.33%)
Mutual labels:  optimization
Totsu
First-order conic solver for convex optimization problems
Stars: ✭ 18 (+50%)
Mutual labels:  optimization
NMFADMM
A sparsity aware implementation of "Alternating Direction Method of Multipliers for Non-Negative Matrix Factorization with the Beta-Divergence" (ICASSP 2014).
Stars: ✭ 39 (+225%)
Mutual labels:  optimization
opt einsum fx
Einsum optimization using opt_einsum and PyTorch FX graph rewriting
Stars: ✭ 13 (+8.33%)
Mutual labels:  optimization
cere
CERE: Codelet Extractor and REplayer
Stars: ✭ 27 (+125%)
Mutual labels:  optimization
arch-packages
Arch Linux performance important packages
Stars: ✭ 27 (+125%)
Mutual labels:  optimization
pytorch-minimize
Newton and Quasi-Newton optimization with PyTorch
Stars: ✭ 51 (+325%)
Mutual labels:  optimization
sam.pytorch
A PyTorch implementation of Sharpness-Aware Minimization for Efficiently Improving Generalization
Stars: ✭ 96 (+700%)
Mutual labels:  optimizer
GPU-Pathtracer
GPU Raytracer from scratch in C++/CUDA
Stars: ✭ 326 (+2616.67%)
Mutual labels:  optimization
mlrHyperopt
Easy Hyper Parameter Optimization with mlr and mlrMBO.
Stars: ✭ 30 (+150%)
Mutual labels:  optimization
a-tour-of-pytorch-optimizers
A tour of different optimization algorithms in PyTorch.
Stars: ✭ 46 (+283.33%)
Mutual labels:  optimization
csso-webpack-plugin
Applies CSSO's full restructuring CSS minification to your webpack bundles
Stars: ✭ 104 (+766.67%)
Mutual labels:  optimization

gfsopt


pip3 install --user gfsopt

Convenient scaffolding for the excellent Global Function Search (GFS) hyperparameter optimizer from the Dlib library.

Provides the following features:

  • Parallel optimization: Run multiple hyperparameter searches in parallel on multiple cores
  • Save and restore progress: Save/restore settings, parameters and optimization progress to/from file.
  • Average over multiple runs: Run a stochastic objective function using the same parameters multiple times and report the average to Dlib's Global Function Search. Useful in highly stochastic domains to avoid biasing the search towards lucky runs.
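Why averaging matters can be illustrated with a small, hypothetical noisy objective in plain Python (this is not the gfsopt API, just a sketch of the idea): a single noisy sample can make a worse parameter look better, while the mean of many samples recovers the true ordering.

```python
import random

def noisy_obj(y, rng):
    # True objective is -y**2 (maximized at y = 0), buried in Gaussian noise.
    return -y**2 + rng.gauss(0, 1)

def averaged_obj(y, rng, n_avg=50):
    # Report the mean of n_avg evaluations instead of a single sample,
    # so a "lucky run" cannot dominate the comparison.
    return sum(noisy_obj(y, rng) for _ in range(n_avg)) / n_avg

rng = random.Random(0)
# With noise std 1, a single sample at y = 1 often beats one at y = 0,
# even though the true values are -1 and 0; the averages rarely disagree.
print(averaged_obj(0.0, rng), averaged_obj(1.0, rng))
```

Reporting the average to the search, as gfsopt does, keeps it from chasing such lucky draws.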

For the theoretical background of GFS, see 'A Global Optimization Algorithm Worth Using' and Malherbe & Vayatis (2017), 'Global optimization of Lipschitz functions'.
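The core idea in Malherbe & Vayatis can be sketched in a few lines: given a Lipschitz constant k and the points evaluated so far, every candidate x has the upper bound min over i of f(x_i) + k * |x - x_i|, and only candidates whose bound exceeds the best value seen so far are worth evaluating. A toy one-dimensional illustration (hypothetical helper, not part of gfsopt or dlib):

```python
def lipo_upper_bound(x, samples, k):
    """Upper bound on f(x) implied by Lipschitz constant k and the
    evaluated points: min over i of f(x_i) + k * |x - x_i|."""
    return min(fx + k * abs(x - xi) for xi, fx in samples)

# f(x) = -x**2 has Lipschitz constant 2 on [-1, 1].
samples = [(-1.0, -1.0), (0.5, -0.25)]
# The bound at x = 0 is min(-1 + 2*1, -0.25 + 2*0.5) = 0.75,
# which indeed dominates the true value f(0) = 0.
print(lipo_upper_bound(0.0, samples, k=2.0))
```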

Example usage

A basic example where we maximize obj_func with respect to y over 10 runs, using as many parallel processes as there are logical cores, and saving progress to file.

from gfsopt import GFSOptimizer

def obj_func(x, y, pid):
    """"Function to be maximized (pid is iteration number)""""
    a = (1.5 - x + x * y)**2
    b = (2.25 - x + x * y * y)**2
    c = (2.625 - x + x * y * y * y)**2
    return -(a + b + c)
    
# For this example, we pretend that we want to keep 'x' fixed at 0.5
# while optimizing 'y' in the range -4.5 to 4.5
pp = {'x': 0.5}  # Fixed problem parameters
space = {'y': [-4.5, 4.5]}  # Parameters to optimize over
optimizer = GFSOptimizer(pp, space, fname="test.pkl")
# Will sample and test 'y' 10 times, then save results, progress and settings to file
optimizer.run(obj_func, n_sims=10)
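The objective above is the negated Beale function, whose global maximum value of 0 sits at (x, y) = (3, 0.5), so results from the optimizer can be sanity-checked by hand:

```python
def obj_func(x, y, pid):
    """Negated Beale function (pid is the iteration number)."""
    a = (1.5 - x + x * y)**2
    b = (2.25 - x + x * y * y)**2
    c = (2.625 - x + x * y * y * y)**2
    return -(a + b + c)

print(obj_func(3.0, 0.5, 0))  # equals 0, the global maximum
print(obj_func(0.5, 0.0, 0))  # strictly negative everywhere else
```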

For a more extensive example, see example.py.

Installation & Requirements

Requires Python >=3.6 and the following libraries:

datadiff
dlib
numpy

To install, run:

pip3 install --user gfsopt

Documentation

See example.py for an example and http://gfsopt.readthedocs.io/ for API documentation.
