
SciML / GalacticOptim.jl

License: MIT
Local, global, and beyond optimization for scientific machine learning (SciML)

Programming Languages

Julia
2,034 projects

Projects that are alternatives of or similar to Galacticoptim.jl

dopt
A numerical optimisation and deep learning framework for the D programming language.
Stars: ✭ 28 (-81.94%)
Mutual labels:  optimization, automatic-differentiation
Aerosandbox
Aircraft design optimization made fast through modern automatic differentiation. Plug-and-play analysis tools for aerodynamics, propulsion, structures, trajectory design, and much, much more.
Stars: ✭ 193 (+24.52%)
Mutual labels:  automatic-differentiation, optimization
Owl
Owl - OCaml Scientific and Engineering Computing @ http://ocaml.xyz
Stars: ✭ 919 (+492.9%)
Mutual labels:  automatic-differentiation, optimization
AbstractOperators.jl
Abstract operators for large scale optimization in Julia
Stars: ✭ 26 (-83.23%)
Mutual labels:  optimization, automatic-differentiation
autodiff
A .NET library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.
Stars: ✭ 69 (-55.48%)
Mutual labels:  optimization, automatic-differentiation
Pennylane
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
Stars: ✭ 800 (+416.13%)
Mutual labels:  automatic-differentiation, optimization
Adcme.jl
Automatic Differentiation Library for Computational and Mathematical Engineering
Stars: ✭ 106 (-31.61%)
Mutual labels:  automatic-differentiation, optimization
Soot
Soot - A Java optimization framework
Stars: ✭ 2,049 (+1221.94%)
Mutual labels:  optimization
Aesara
Aesara is a fork of the Theano library that is maintained by the PyMC developers. It was previously named Theano-PyMC.
Stars: ✭ 145 (-6.45%)
Mutual labels:  automatic-differentiation
E2e Model Learning
Task-based end-to-end model learning in stochastic optimization
Stars: ✭ 140 (-9.68%)
Mutual labels:  optimization
Pysot
Surrogate Optimization Toolbox for Python
Stars: ✭ 136 (-12.26%)
Mutual labels:  optimization
Nlopt.jl
Package to call the NLopt nonlinear-optimization library from the Julia language
Stars: ✭ 141 (-9.03%)
Mutual labels:  optimization
Evalml
EvalML is an AutoML library written in python.
Stars: ✭ 145 (-6.45%)
Mutual labels:  optimization
Ltecleanerfoss
The last Android cleaner you'll ever need!
Stars: ✭ 141 (-9.03%)
Mutual labels:  optimization
Yopo You Only Propagate Once
Code for our NeurIPS 2019 paper: You Only Propagate Once: Accelerating Adversarial Training via Maximal Principle
Stars: ✭ 152 (-1.94%)
Mutual labels:  optimization
Pygmo2
A Python platform to perform parallel computations of optimisation tasks (global and local) via the asynchronous generalized island model.
Stars: ✭ 134 (-13.55%)
Mutual labels:  optimization
Swissarmylib
Collection of helpful utilities we use in our Unity projects.
Stars: ✭ 154 (-0.65%)
Mutual labels:  optimization
Webdnn
The Fastest DNN Running Framework on Web Browser
Stars: ✭ 1,850 (+1093.55%)
Mutual labels:  optimization
Deep Learning Specialization Coursera
Deep Learning Specialization courses by Andrew Ng, deeplearning.ai
Stars: ✭ 146 (-5.81%)
Mutual labels:  optimization
Fantasy Basketball
Scraping statistics, predicting NBA player performance with neural networks and boosting algorithms, and optimising lineups for DraftKings with a genetic algorithm. Capstone Project for the Machine Learning Engineer Nanodegree by Udacity.
Stars: ✭ 146 (-5.81%)
Mutual labels:  optimization

GalacticOptim.jl


GalacticOptim.jl is a package with a scope that goes beyond your typical global optimization package. It seeks to bring together all of the optimization packages it can find, local and global, into one unified Julia interface. This means you learn one package, and you learn them all! GalacticOptim.jl also adds a few high-level features, such as integration with automatic differentiation, to make its usage fairly simple for most common cases, while still exposing all of the options through a single unified interface.

Note: This package is still in active development.

Installation

Assuming that you already have Julia correctly installed, it suffices to install GalacticOptim.jl in the standard way:

import Pkg; Pkg.add("GalacticOptim")

The packages relevant to the core functionality of GalacticOptim.jl are installed automatically, so in most cases you do not have to worry about installing dependencies manually. However, optimizer backends that are not core dependencies (such as BlackBoxOptim.jl, used in the examples below) must be installed explicitly if you intend to use the optimization algorithms they offer.
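For example, to run the BlackBoxOptim.jl example shown later in this README, add the backend package first (a minimal sketch; any other non-core backend is added the same way):

import Pkg; Pkg.add("BlackBoxOptim")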

Tutorials and Documentation

For information on using the package, see the stable documentation. Use the in-development documentation for the version that includes unreleased features.

Examples

 using GalacticOptim, Optim

 # Rosenbrock function parameterized by p = (a, b): f(x) = (a - x₁)² + b(x₂ - x₁²)²
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p  = [1.0, 100.0]

 # Local, derivative-free optimization with Nelder-Mead from Optim.jl
 prob = OptimizationProblem(rosenbrock, x0, p)
 sol = solve(prob, NelderMead())


 # Global optimization with BlackBoxOptim.jl; box constraints (lb/ub) are required
 using BlackBoxOptim
 prob = OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
 sol = solve(prob, BBO())

Note that Optim.jl is a core dependency of GalacticOptim.jl. However, BlackBoxOptim.jl is not and must already be installed (see the installation notes above).

Warning: The output of the second optimization task (BBO()) is currently misleading in that it reports Status: failure (reached maximum number of iterations). Convergence is in fact reached; the confusing message stems from the reliance on the Optim.jl output struct, in which reaching the maximum number of iterations is rightly regarded as a failure. An improved output struct will be implemented soon.
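Despite the status message, the returned solution can be inspected directly (a minimal sketch, assuming the solution object exposes the minimizer and minimum fields of the Optim.jl-style result struct it wraps):

 sol.minimizer  # should be ≈ [1.0, 1.0], the known optimum of the Rosenbrock function
 sol.minimum    # objective value at the minimizer, ≈ 0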

The output of the first optimization task (with the NelderMead() algorithm) is given below:

* Status: success

* Candidate solution
   Final objective value:     3.525527e-09

* Found with
   Algorithm:     Nelder-Mead

* Convergence measures
   √(Σ(yᵢ-ȳ)²)/n ≤ 1.0e-08

* Work counters
   Seconds run:   0  (vs limit Inf)
   Iterations:    60
   f(x) calls:    118

We can also explore other methods in a similar way:

 # Attach an automatic-differentiation backend so gradient-based methods can be used
 f = OptimizationFunction(rosenbrock, GalacticOptim.AutoForwardDiff())
 prob = OptimizationProblem(f, x0, p)
 sol = solve(prob, BFGS())

For instance, the above optimization task produces the following output:

* Status: success

* Candidate solution
   Final objective value:     7.645684e-21

* Found with
   Algorithm:     BFGS

* Convergence measures
   |x - x'|               = 3.48e-07 ≰ 0.0e+00
   |x - x'|/|x'|          = 3.48e-07 ≰ 0.0e+00
   |f(x) - f(x')|         = 6.91e-14 ≰ 0.0e+00
   |f(x) - f(x')|/|f(x')| = 9.03e+06 ≰ 0.0e+00
   |g(x)|                 = 2.32e-09 ≤ 1.0e-08

* Work counters
   Seconds run:   0  (vs limit Inf)
   Iterations:    16
   f(x) calls:    53
   ∇f(x) calls:   53

Box constraints can also be combined with local gradient-based optimizers, for example by wrapping the optimizer in Fminbox:

 prob = OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
 sol = solve(prob, Fminbox(GradientDescent()))

These examples demonstrate that GalacticOptim.jl provides an intuitive way of specifying optimization tasks and offers relatively easy access to a wide range of optimization algorithms.
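Finally, other automatic-differentiation backends can be swapped in the same way as AutoForwardDiff above. A minimal sketch, assuming GalacticOptim's AutoZygote backend is available and Zygote.jl is installed:

 # Same problem, but with reverse-mode AD via Zygote (assumes Zygote.jl is installed)
 f = OptimizationFunction(rosenbrock, GalacticOptim.AutoZygote())
 prob = OptimizationProblem(f, x0, p)
 sol = solve(prob, BFGS())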

Note that the project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].