ohinder / OnePhase.jl

Licence: other


A one-phase interior point method for nonconvex optimization

This package implements a one-phase interior point method that finds KKT points of optimization problems of the form:

min f(x) s.t. a(x) ≤ 0

where the functions f : R^n -> R and a : R^n -> R^m are twice differentiable. The one-phase algorithm also handles bound constraints and nonlinear equality constraints.
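For reference, a KKT point of this problem is a pair (x, y) of primal and dual variables satisfying the first-order optimality conditions (the standard textbook statement, not anything specific to this package):

```latex
% KKT conditions for  min f(x)  s.t.  a(x) <= 0
\begin{aligned}
\nabla f(x) + \nabla a(x)^{\top} y &= 0, && \text{(stationarity)}\\
a(x) &\le 0, && \text{(primal feasibility)}\\
y &\ge 0, && \text{(dual feasibility)}\\
y_i\, a_i(x) &= 0, \quad i = 1,\dots,m. && \text{(complementarity)}
\end{aligned}
```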

The package is currently in development, but you are welcome to try it out; please report any bugs you find. Note that the code is generally significantly slower than Ipopt in terms of raw runtime, particularly on small problems (although the iteration count is competitive). However, we recommend trying our one-phase IPM if Ipopt is failing to solve, or if the problem is very large or might be infeasible.

How to install

Open the Julia package manager (type "]" in the REPL) and enter

add https://github.com/ohinder/advanced_timer.jl
add https://github.com/ohinder/OnePhase.git
add NLPModels@0.18.1
add JuMP@0.22.3
add NLPModelsJuMP@0.9.1
add MathOptInterface@0.10.7
test OnePhase

How to use with JuMP

Here is a simple example where a JuMP model is passed to the one-phase solver:

using OnePhase, JuMP

m = Model()
set_optimizer(m, OnePhase.OnePhaseSolver)
@variable(m, x, start=-3)
@objective(m, Min, x)
@NLconstraint(m, x^2 >= 1.0)
@NLconstraint(m, x >= -1.0)

optimize!(m)
status = termination_status(m)
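After the solve, the result can be inspected with JuMP's standard query functions (this is the generic JuMP API, not anything OnePhase-specific); a self-contained sketch:

```julia
using OnePhase, JuMP

m = Model()
set_optimizer(m, OnePhase.OnePhaseSolver)
@variable(m, x, start = -3)
@objective(m, Min, x)
@NLconstraint(m, x^2 >= 1.0)
optimize!(m)

# Standard JuMP solution queries:
@show termination_status(m)  # solver's reported status
@show value(x)               # primal solution for x
@show objective_value(m)     # objective value at the solution
```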

Example using CUTEst

Install CUTEst then run

using OnePhase, CUTEst
nlp = CUTEstModel("CHAIN")
iter, status, hist, t, err, timer = one_phase_solve(nlp);
@show get_original_x(iter) # gives the primal solution of the solver
@show get_y(iter) # gives the dual solution of the solver

Feedback?

If you have found a bug or think there is some way I can improve the code, feel free to contact me! My webpage is https://www.oliverhinder.com/.

Solver output

it = iteration number

step = step type, either stabilization step (s) or aggressive step (a)

eta = targeted reduction in the feasibility/barrier parameter; eta = 1 for stabilization steps, eta < 1 for aggressive steps

α_P = primal step size

α_D = dual step size

ls = number of points trialled during line search

|dx| = infinity norm of the primal direction

|dy| = infinity norm of the dual direction

N err = relative error in linear system solves.

mu = value of barrier parameter

dual = gradient of the Lagrangian scaled by the largest dual variable

primal = error in primal feasibility

cmp scaled = | Sy |/(1 + |y|)

inf = measure of how close the problem is to being infeasible; values close to zero indicate infeasibility

delta = size of perturbation

#fac = number of factorizations, split into two numbers: the first is how many factorizations were needed to ensure the primal Schur complement is positive definite; the second is the total number of factorizations, including any increases in delta to avoid issues when the direction quality is very poor

|x| = infinity norm of x variables

|y| = infinity norm of y variables

∇phi = gradient of log barrier

phi = value of log barrier
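The scaled complementarity measure above can be written explicitly; here s denotes the slack vector and S = diag(s), with the infinity norm assumed to match the other column definitions:

```latex
\text{cmp scaled} \;=\; \frac{\lVert S y \rVert_{\infty}}{1 + \lVert y \rVert_{\infty}},
\qquad S = \mathrm{diag}(s)
```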
