
JuliaNLSolvers / Optim.jl

Licence: other
Optimization functions for Julia


Projects that are alternatives of or similar to Optim.jl

Imagemin
[Unmaintained] Minify images seamlessly
Stars: ✭ 4,948 (+628.72%)
Mutual labels:  optimization
Glidebitmappool
Glide Bitmap Pool is a memory management library for reusing bitmap memory
Stars: ✭ 544 (-19.88%)
Mutual labels:  optimization
Jenetics
Jenetics - Genetic Algorithm, Genetic Programming, Evolutionary Algorithm, and Multi-objective Optimization
Stars: ✭ 616 (-9.28%)
Mutual labels:  optimization
Notcpucores
Work, Play, Stream - Without the Stutter. Download using Releases button below
Stars: ✭ 514 (-24.3%)
Mutual labels:  optimization
Vroom
Vehicle Routing Open-source Optimization Machine
Stars: ✭ 533 (-21.5%)
Mutual labels:  optimization
Pymoo
NSGA2, NSGA3, R-NSGA3, MOEAD, Genetic Algorithms (GA), Differential Evolution (DE), CMAES, PSO
Stars: ✭ 547 (-19.44%)
Mutual labels:  optimization
Autokernel
AutoKernel is a simple, easy-to-use, low-barrier tool for automatic operator optimization that improves the deployment efficiency of deep learning algorithms.
Stars: ✭ 485 (-28.57%)
Mutual labels:  optimization
Cppnumericalsolvers
A lightweight C++17 library of numerical optimization methods for nonlinear functions (including L-BFGS-B for TensorFlow)
Stars: ✭ 638 (-6.04%)
Mutual labels:  optimization
Pagmo2
A C++ platform to perform parallel computations of optimisation tasks (global and local) via the asynchronous generalized island model.
Stars: ✭ 540 (-20.47%)
Mutual labels:  optimization
Teaser Plusplus
A fast and robust point cloud registration library
Stars: ✭ 607 (-10.6%)
Mutual labels:  optimization
Hyperparameter Optimization Of Machine Learning Algorithms
Implementation of hyperparameter optimization/tuning methods for machine learning & deep learning models (easy & clear)
Stars: ✭ 516 (-24.01%)
Mutual labels:  optimization
Apkgolf
The smallest Android APK in the world
Stars: ✭ 528 (-22.24%)
Mutual labels:  optimization
Bayesianoptimization
A Python implementation of global optimization with Gaussian processes.
Stars: ✭ 5,611 (+726.36%)
Mutual labels:  optimization
Faceswap
3D face swapping implemented in Python
Stars: ✭ 508 (-25.18%)
Mutual labels:  optimization
Better Monadic For
Desugaring Scala `for` without implicit `withFilter`s
Stars: ✭ 622 (-8.39%)
Mutual labels:  optimization
Lepto
Automated image editing, optimization, and analysis via CLI and a web interface. You give lepto your input and output directories, the plugins you want to use, and their options. Lepto then does its job: you keep your original files and the structure of the input directory. Some plugins can even collect data (like primary colors) from your images and save it in a JSON file.
Stars: ✭ 490 (-27.84%)
Mutual labels:  optimization
Solid
🎯 A comprehensive gradient-free optimization framework written in Python
Stars: ✭ 546 (-19.59%)
Mutual labels:  optimization
Hyperparameter hunter
Easy hyperparameter optimization and automatic result saving across machine learning algorithms and libraries
Stars: ✭ 648 (-4.57%)
Mutual labels:  optimization
Pydis
A Redis clone in Python 3 to disprove some falsehoods about performance.
Stars: ✭ 623 (-8.25%)
Mutual labels:  optimization
Chillout
Reduce CPU usage with a non-blocking async loop, and psychologically speed up JavaScript
Stars: ✭ 565 (-16.79%)
Mutual labels:  optimization

Optim.jl

Univariate and multivariate optimization in Julia.

Optim.jl is part of the JuliaNLSolvers family.


Optimization

Optim.jl is a package for univariate and multivariate optimization of functions. A typical usage example is:

using Optim
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
result = optimize(rosenbrock, zeros(2), BFGS())

This minimizes the Rosenbrock function

f(x, y) = (a - x)^2 + b * (y - x^2)^2

with a = 1, b = 100, and the initial values x = 0, y = 0. Both squared terms vanish at x = a, y = a^2, so the global minimum of 0 is attained at (a, a^2).

The above code gives the output:
* Status: success

* Candidate solution
  Minimizer: [1.00e+00, 1.00e+00]
  Minimum:   5.471433e-17

* Found with
  Algorithm:     BFGS
  Initial Point: [0.00e+00, 0.00e+00]

* Convergence measures
  |x - x'|               = 3.47e-07 ≰ 0.0e+00
  |x - x'|/|x'|          = 3.47e-07 ≰ 0.0e+00
  |f(x) - f(x')|         = 6.59e-14 ≰ 0.0e+00
  |f(x) - f(x')|/|f(x')| = 1.20e+03 ≰ 0.0e+00
  |g(x)|                 = 2.33e-09 ≤ 1.0e-08

* Work counters
  Seconds run:   0  (vs limit Inf)
  Iterations:    16
  f(x) calls:    53
  ∇f(x) calls:   53
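
The returned object can also be queried programmatically instead of being parsed from the printed report. A minimal sketch using Optim.jl's accessor functions:

# Extract results from the object returned by optimize.
xmin = Optim.minimizer(result)    # approximate minimizer, here ≈ [1.0, 1.0]
fmin = Optim.minimum(result)      # objective value at the minimizer
flag = Optim.converged(result)    # whether a convergence criterion was met
its  = Optim.iterations(result)   # number of iterations performed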

To get information on the keywords used to construct method instances, use the Julia REPL help prompt (?):

help?> LBFGS
search: LBFGS

     LBFGS
    ≡≡≡≡≡≡≡

     Constructor
    =============

  LBFGS(; m::Integer = 10,
          alphaguess = LineSearches.InitialStatic(),
          linesearch = LineSearches.HagerZhang(),
          P = nothing,
          precondprep = (P, x) -> nothing,
          manifold = Flat(),
          scaleinvH0::Bool = true && (typeof(P) <: Nothing))

  LBFGS has two special keywords: the memory length m and
  the scaleinvH0 flag. The memory length determines how many
  previous Hessian approximations to store. When scaleinvH0
  == true, then the initial guess in the two-loop recursion
  to approximate the inverse Hessian is the scaled identity,
  as can be found in Nocedal and Wright (2nd edition) (sec.
  7.2).

  In addition, LBFGS supports preconditioning via the P and
  precondprep keywords.

     Description
    =============

  The LBFGS method implements the limited-memory BFGS
  algorithm as described in Nocedal and Wright (sec. 7.2,
  2006) and original paper by Liu & Nocedal (1989). It is a
  quasi-Newton method that updates an approximation to the
  Hessian using past approximations as well as the gradient.

     References
    ============

    •    Nocedal, J. and Wright, S. J. (2006). Numerical
        Optimization, 2nd edition. Springer.

    •    Liu, D. C. and Nocedal, J. (1989). "On the Limited
        Memory BFGS Method for Large Scale Optimization".
        Mathematical Programming B, 45(3), 503–528.
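
As an illustration of these keywords, here is a minimal sketch that builds an LBFGS instance with a longer memory and supplies an analytic in-place gradient for the Rosenbrock example above (the name rosenbrock_grad! is ours; any g!(g, x) that overwrites g works):

using Optim

rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# In-place gradient: optimize(f, g!, x0, method) expects g!(g, x) to
# overwrite the buffer g with the gradient of f at x.
function rosenbrock_grad!(g, x)
    g[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    g[2] = 200.0 * (x[2] - x[1]^2)
    return g
end

# m = 20 stores twenty past curvature pairs instead of the default ten.
result = optimize(rosenbrock, rosenbrock_grad!, zeros(2), LBFGS(m = 20))

If no gradient is supplied, Optim falls back to finite differences; forward-mode automatic differentiation can be requested instead by passing autodiff = :forward to optimize.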

Documentation

For more details and options, see the documentation:

  • STABLE — most recently tagged version of the documentation.
  • LATEST — in-development version of the documentation.

Installation

Optim.jl is a registered package and can be installed with Pkg.add:

julia> using Pkg; Pkg.add("Optim")

or through the pkg REPL mode by typing

] add Optim

Citation

If you use Optim.jl in your work, please cite the following.

@article{mogensen2018optim,
  author  = {Mogensen, Patrick Kofod and Riseth, Asbj{\o}rn Nilsen},
  title   = {Optim: A mathematical optimization package for {Julia}},
  journal = {Journal of Open Source Software},
  year    = {2018},
  volume  = {3},
  number  = {24},
  pages   = {615},
  doi     = {10.21105/joss.00615}
}