EAGO: Easy-Advanced Global Optimization

EAGO is an open-source development environment for robust and global optimization in Julia.


EAGO's Optimizer Capabilities

EAGO is a deterministic global optimizer designed to address a wide variety of optimization problems, with an emphasis on nonlinear programs (NLPs), by propagating McCormick relaxations along the factorable structure of each expression in the NLP. Most operators supported by modern automatic differentiation (AD) packages (e.g., +, sin, cosh) are supported by EAGO, and a number of utilities for sanitizing native Julia code and generating relaxations of a wide variety of user-defined functions are included. Currently, EAGO supports problems that have a priori variable bounds defined and differentiable constraints.
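Such problems can be written in the standard NLP form below (a reconstruction from the surrounding text; the original README renders this form as an image):

```latex
\begin{aligned}
\min_{x \in X} \quad & f(x) \\
\text{s.t.} \quad & g(x) \leq 0, \\
& h(x) = 0, \\
& X = \{ x \in \mathbb{R}^n : x^L \leq x \leq x^U \}
\end{aligned}
```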

EAGO's Relaxations

For each nonlinear term, EAGO makes use of factorable representations to construct bounds and relaxations. In the case of F = y*(y - 5)*sin(y), a list is generated and rules for constructing McCormick relaxations are used to formulate relaxations in the original Y decision space [1]:

  • v1 = y
  • v2 = v1 - 5
  • v3 = sin(v1)
  • v4 = v1*v2
  • v5 = v4*v3
  • F = v5

Either these original relaxations, differentiable McCormick relaxations [2], or affine relaxations thereof can be used to construct relaxations of optimization problems useful in branch and bound routines for global optimization. Utilities are included to combine these with algorithms for relaxing implicit functions [3] and forward-reverse propagation of McCormick arithmetic [4].
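The bounding side of this factorable list can be sketched with naive interval arithmetic (a self-contained illustration, not EAGO's API; EAGO propagates full McCormick convex/concave relaxations through the same list, which yields much tighter enclosures):

```julia
# Propagate interval bounds through the factorable list for
# F = y*(y - 5)*sin(y) on Y = [1, 4].
struct Ival
    lo::Float64
    hi::Float64
end

Base.:-(a::Ival, b::Float64) = Ival(a.lo - b, a.hi - b)

function Base.:*(a::Ival, b::Ival)
    p = (a.lo*b.lo, a.lo*b.hi, a.hi*b.lo, a.hi*b.hi)
    return Ival(minimum(p), maximum(p))
end

# Crude enclosure of sin over an interval: endpoint values, plus the interior
# maximum at pi/2 when it lies inside (sufficient for this particular interval).
function intervalsin(a::Ival)
    lo = min(sin(a.lo), sin(a.hi))
    hi = (a.lo <= pi/2 <= a.hi) ? 1.0 : max(sin(a.lo), sin(a.hi))
    return Ival(lo, hi)
end

v1 = Ival(1.0, 4.0)   # v1 = y
v2 = v1 - 5.0         # v2 = v1 - 5
v3 = intervalsin(v1)  # v3 = sin(v1)
v4 = v1 * v2          # v4 = v1*v2
F  = v4 * v3          # F  = v4*v3
println((F.lo, F.hi)) # enclosure of F over Y; here (-16.0, ≈12.109)
```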

Sample Usage

EAGO makes use of the JuMP algebraic modeling language to improve the user's experience in setting up optimization models. Consider the familiar "process" problem instance [5]:

This model can be formulated using JuMP code as:

using JuMP, EAGO

m = Model(EAGO.Optimizer)

# Define bounded variables
xL = [10.0; 0.0; 0.0; 0.0; 0.0; 85.0; 90.0; 3.0; 1.2; 145.0]
xU = [2000.0; 16000.0; 120.0; 5000.0; 2000.0; 93.0; 95.0; 12.0; 4.0; 162.0]
@variable(m, xL[i] <= x[i=1:10] <= xU[i])

# Define nonlinear constraints
@NLconstraint(m, e1, -x[1]*(1.12 + 0.13167*x[8] - 0.00667*(x[8])^2) + x[4] == 0.0)
@NLconstraint(m, e3, -0.001*x[4]*x[9]*x[6]/(98 - x[6]) + x[3] == 0.0)
@NLconstraint(m, e4, -(1.098*x[8] - 0.038*(x[8])^2) - 0.325*x[6] + x[7] == 57.425)
@NLconstraint(m, e5, -(x[2] + x[5])/x[1] + x[8] == 0.0)

# Define linear constraints
@constraint(m, e2, -x[1]+1.22*x[4]-x[5] == 0.0)
@constraint(m, e6, x[9]+0.222*x[10] == 35.82)
@constraint(m, e7, -3*x[7]+x[10] == -133.0)

# Define nonlinear objective
@NLobjective(m, Max, 0.063*x[4]*x[7] - 5.04*x[1] - 0.035*x[2] - 10*x[3] - 3.36*x[5])

# Solve the optimization problem
JuMP.optimize!(m)

Special handling is included for linear and quadratic functions defined using the @constraint macro in JuMP; these can generally be expected to perform better than linear or quadratic terms specified with the @NLconstraint macro.
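For example, a quadratic constraint is best given to the @constraint macro so that EAGO sees its structure directly (a sketch; the variable bounds and constraint names here are illustrative):

```julia
using JuMP, EAGO

m = Model(EAGO.Optimizer)
@variable(m, -2.0 <= y[1:2] <= 2.0)

# Preferred: the quadratic structure is visible to EAGO's specialized routines.
@constraint(m, q1, y[1]^2 + y[2]^2 <= 4.0)

# Also valid, but treated as a generic nonlinear expression:
# @NLconstraint(m, q2, y[1]^2 + y[2]^2 <= 4.0)
```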

A Cautionary Note on Global Optimization

As a global optimization platform, EAGO's solvers can be used to find solutions of general nonconvex problems with a guaranteed certificate of optimality. However, global solvers suffer from the curse of dimensionality, so their performance lags that of convex/local solvers. Users interested in large-scale applications should be warned that problems larger than a few variables may prove challenging for certain classes of global optimization problems.

Package Capabilities

The EAGO package has numerous features: a solver accessible from JuMP/MathOptInterface, domain reduction routines, McCormick relaxations, and specialized nonconvex semi-infinite program solvers. A full description of all EAGO features is available in the documentation website. A series of examples is provided as Jupyter notebooks in the separate EAGO-notebooks repository.
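The McCormick relaxation utilities can also be used directly, outside of a JuMP model (a sketch based on EAGO's McCormick arithmetic; the MC constructor signature, the Interval type from IntervalArithmetic, and the field names reflect recent versions and may differ in older releases):

```julia
using EAGO, IntervalArithmetic

# Relax F = y*(y - 5)*sin(y) at the point y = 2 on the box Y = [1, 4].
# MC{N,NS} is a nonsmooth McCormick object: N is the number of decision
# variables and the final argument is this variable's index.
y = MC{1,NS}(2.0, Interval{Float64}(1.0, 4.0), 1)
F = y*(y - 5.0)*sin(y)

F.cv, F.cc   # convex/concave relaxation values of F at y = 2
F.Intv       # interval bounds on F over Y
```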

Recent News

  • 2/5/2021: EAGO v0.6.0 has been tagged.
    • License changed from CC BY-NC-SA 4.0 to MIT
    • Fix deprecated Ipopt constructor
    • Fix discrepancy between the returned objective value and the objective evaluated at the solution.
    • Dramatically decrease allocations and improve first-run performance of SIP routines.
    • Add two algorithms that modify SIPres, detailed in Djelassi, H. and Mitsos, A. (2017).
    • Fix objective interval fallback function.
    • New SIP interface with extendable subroutines.
    • Fix x^y relaxation bug.
    • Add issues template.
    • Add SIP subroutine documentation.

For a full list of EAGO release news, click here.

Installing EAGO

EAGO is a registered Julia package and it can be installed using the Julia package manager. From the Julia REPL, type ] to enter the Pkg REPL mode and run the following command

pkg> add EAGO

Currently, EAGO requires JuMP v0.19 or greater. This allows EAGO to share some internal features with JuMP's AD scheme, e.g., generation of Wengert tapes and passing evaluators between JuMP and EAGO.

pkg> add JuMP

EAGO v0.6.0 is the current tagged version and requires Julia 1.2+ for full functionality (Julia 1.0+ supports partial functionality). Use with Julia 1.5 is recommended, as the majority of in-house testing has occurred with this version. The user is directed to the High-Performance Configuration for instructions on how to install a high-performance version of EAGO (rather than the basic, entirely open-source version). If any issues are encountered when loading or using EAGO, please submit an issue using the GitHub issue tracker.

Bug reporting, support and feature requests

Please report bugs or feature requests by opening an issue using the GitHub issue tracker. All manner of feedback is encouraged.

Current limitations

  • Nonlinear handling assumes that box constraints on nonlinear terms are available or can be inferred from bound tightening.
  • Currently supports only continuous functions. Support for mixed-integer problems is forthcoming.

Work In Progress

  • Extensions for nonconvex dynamic global & robust optimization.
  • Provide support for mixed-integer problems.
  • Update EAGO to support nonsmooth problems (requires a nonsmooth local NLP optimizer or lexicographic AD; support for relaxations is already included).
  • Performance assessment of nonlinear (differentiable) relaxations and incorporation into main EAGO routine.
  • Evaluation and incorporation of implicit relaxation routines in basic solver.

Citing EAGO

Please cite the following paper when using EAGO. In plain text form this is:

 M. E. Wilhelm & M. D. Stuber (2020) EAGO.jl: easy advanced global optimization in Julia,
 Optimization Methods and Software, DOI: 10.1080/10556788.2020.1786566

A corresponding BibTeX entry is given below, and a .bib file is provided in citation.bib.

@article{doi:10.1080/10556788.2020.1786566,
  author    = {M. E. Wilhelm and M. D. Stuber},
  title     = {EAGO.jl: easy advanced global optimization in Julia},
  journal   = {Optimization Methods and Software},
  pages     = {1--26},
  year      = {2020},
  publisher = {Taylor & Francis},
  doi       = {10.1080/10556788.2020.1786566},
  url       = {https://doi.org/10.1080/10556788.2020.1786566},
  eprint    = {https://doi.org/10.1080/10556788.2020.1786566}
}

Related Packages

  • ValidatedNumerics.jl: A Julia library for validated interval calculations, including basic interval extensions, constraint programming, and interval contractors
  • MAiNGO: An open-source mixed-integer nonlinear programming package in C++ that utilizes MC++ for relaxations.
  • MC++: A mature McCormick relaxation package in C++ that also includes McCormick-Taylor, Chebyshev polyhedral, and ellipsoidal arithmetics.

References

  1. A. Mitsos, B. Chachuat, and P. I. Barton. McCormick-based relaxations of algorithms. SIAM Journal on Optimization, 20(2):573–601 (2009).
  2. K. A. Khan, H. A. J. Watson, and P. I. Barton. Differentiable McCormick relaxations. Journal of Global Optimization, 67(4):687–729 (2017).
  3. M. D. Stuber, J. K. Scott, and P. I. Barton. Convex and concave relaxations of implicit functions. Optimization Methods and Software, 30(3):424–460 (2015).
  4. A. Wechsung, J. K. Scott, H. A. J. Watson, and P. I. Barton. Reverse propagation of McCormick relaxations. Journal of Global Optimization, 63(1):1–36 (2015).
  5. J. Bracken and G. P. McCormick. Selected Applications of Nonlinear Programming. John Wiley and Sons, New York (1968).