lanl-ansi / Juniper.jl

Licence: mit
A JuMP-based Nonlinear Integer Program Solver

Programming Languages

julia

Projects that are alternatives of or similar to Juniper.jl

Beautiful React Redux
Redux πŸš€, Redux 🀘, Redux πŸ”₯ - and the magic optimization
Stars: ✭ 87 (-15.53%)
Mutual labels:  optimization
Proximaloperators.jl
Proximal operators for nonsmooth optimization in Julia
Stars: ✭ 92 (-10.68%)
Mutual labels:  optimization
Dists
IQA: Deep Image Structure and Texture Similarity Metric
Stars: ✭ 101 (-1.94%)
Mutual labels:  optimization
Safeopt
Safe Bayesian Optimization
Stars: ✭ 90 (-12.62%)
Mutual labels:  optimization
Deep Learning Drizzle
Drench yourself in Deep Learning, Reinforcement Learning, Machine Learning, Computer Vision, and NLP by learning from these exciting lectures!!
Stars: ✭ 9,717 (+9333.98%)
Mutual labels:  optimization
Limes
Link Discovery Framework for Metric Spaces.
Stars: ✭ 94 (-8.74%)
Mutual labels:  optimization
Auxpack
A dashboard for monitoring Webpack build stats.
Stars: ✭ 86 (-16.5%)
Mutual labels:  optimization
Node Finance
Module for portfolio optimization, prices and options
Stars: ✭ 101 (-1.94%)
Mutual labels:  optimization
Rinsim
RinSim is a logistics simulator written in Java. RinSim supports (de)centralized algorithms for dynamic pickup-and-delivery problems (PDP). The simulator is developed at the imec-DistriNet group at the dept. of Computer Science, KU Leuven, Belgium.
Stars: ✭ 91 (-11.65%)
Mutual labels:  optimization
Advisor
Open-source implementation of Google Vizier for hyper parameters tuning
Stars: ✭ 1,359 (+1219.42%)
Mutual labels:  optimization
Jplusone
Tool for automatic detection and asserting "N+1 SELECT problem" occurences in JPA based Spring Boot Java applications and finding origin of JPA issued SQL statements in general
Stars: ✭ 91 (-11.65%)
Mutual labels:  optimization
Go Perfbook
Thoughts on Go performance optimization
Stars: ✭ 9,597 (+9217.48%)
Mutual labels:  optimization
Qreverse
A small study in hardware accelerated AoS reversal
Stars: ✭ 97 (-5.83%)
Mutual labels:  optimization
Quantum Learning
This repository contains the source code used to produce the results presented in the paper "Machine learning method for state preparation and gate synthesis on photonic quantum computers".
Stars: ✭ 89 (-13.59%)
Mutual labels:  optimization
Unreachable
Unreachable code path optimization hint for Swift
Stars: ✭ 101 (-1.94%)
Mutual labels:  optimization
Csso Rails
CSS Optimizer(csso) ruby wrapper for Rails Asset pipeline
Stars: ✭ 86 (-16.5%)
Mutual labels:  optimization
Webpack Conditional Loader
C conditionals directive for JavaScript
Stars: ✭ 93 (-9.71%)
Mutual labels:  optimization
Jump.jl
Modeling language for Mathematical Optimization (linear, mixed-integer, conic, semidefinite, nonlinear)
Stars: ✭ 1,383 (+1242.72%)
Mutual labels:  optimization
Alpine.jl
A JuMP-based Global Optimization Solver for Non-convex Programs
Stars: ✭ 101 (-1.94%)
Mutual labels:  optimization
Monkeys
A strongly-typed genetic programming framework for Python
Stars: ✭ 98 (-4.85%)
Mutual labels:  optimization

Juniper

Status: CI codecov Documentation

Idea

You have a nonlinear problem with discrete variables (MINLP) and want more control over the branch-and-bound part. The relaxation should be solvable by any solver you prefer; some solvers might not be able to handle the mixed-integer part by themselves.

Juniper (JuMP Nonlinear Integer Program solver) is a heuristic for non-convex problems. Do you need the global optimum? Check out Alpine.jl.

Basic usage

Juniper can be installed via the Julia package manager (for Julia v1.x):

] add Juniper

Then add it to your project with

using Juniper

You also have to import your NLP solver, e.g.

using Ipopt

as well as JuMP:

using JuMP

Define Juniper as the optimizer:

optimizer = Juniper.Optimizer
nl_solver = optimizer_with_attributes(Ipopt.Optimizer, "print_level"=>0)

And give it a go:

using LinearAlgebra # for the dot product
m = Model(optimizer_with_attributes(optimizer, "nl_solver"=>nl_solver))

v = [10,20,12,23,42]
w = [12,45,12,22,21]
@variable(m, x[1:5], Bin)

@objective(m, Max, dot(v,x))

@NLconstraint(m, sum(w[i]*x[i]^2 for i=1:5) <= 45)   

optimize!(m)

# retrieve the objective value, corresponding x values and the status
println(JuMP.value.(x))
println(JuMP.objective_value(m))
println(JuMP.termination_status(m))

Juniper is an NLP-based solver; therefore your model should contain at least one @NLconstraint or @NLobjective.
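For instance, a model with a nonlinear objective (instead of a nonlinear constraint) also satisfies this requirement. The following is a minimal sketch with illustrative data and bounds:

```julia
using JuMP, Ipopt, Juniper

nl_solver = optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 0)
m = Model(optimizer_with_attributes(Juniper.Optimizer, "nl_solver" => nl_solver))

@variable(m, 0 <= x[1:3] <= 10, Int)

# At least one @NLobjective or @NLconstraint is required;
# here the nonlinearity lives in the objective.
@NLobjective(m, Min, sum((x[i] - 2.5)^2 for i in 1:3))

optimize!(m)
println(JuMP.value.(x))  # integer values near 2.5
```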

It is recommended to specify a MIP solver as well, e.g.

using Cbc
optimizer = Juniper.Optimizer
nl_solver = optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 0)
mip_solver = optimizer_with_attributes(Cbc.Optimizer, "logLevel" => 0)
m = Model(optimizer_with_attributes(optimizer, "nl_solver"=>nl_solver, "mip_solver"=>mip_solver))

With a MIP solver specified, the feasibility pump is used to find a feasible solution before the branch-and-bound part starts, which has turned out to be highly effective.
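Putting it together, the knapsack model from above can be solved with both solvers configured. This is a sketch combining the snippets in this README:

```julia
using JuMP, Ipopt, Cbc, Juniper
using LinearAlgebra  # for the dot product

nl_solver  = optimizer_with_attributes(Ipopt.Optimizer, "print_level" => 0)
mip_solver = optimizer_with_attributes(Cbc.Optimizer, "logLevel" => 0)
m = Model(optimizer_with_attributes(Juniper.Optimizer,
                                    "nl_solver" => nl_solver,
                                    "mip_solver" => mip_solver))

v = [10, 20, 12, 23, 42]
w = [12, 45, 12, 22, 21]
@variable(m, x[1:5], Bin)
@objective(m, Max, dot(v, x))
@NLconstraint(m, sum(w[i] * x[i]^2 for i in 1:5) <= 45)

optimize!(m)
println(JuMP.termination_status(m))  # typically LOCALLY_SOLVED, as Juniper is a heuristic
```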

Citing Juniper

If you find Juniper useful in your work, we kindly request that you cite the following paper or technical report:

@inproceedings{juniper,
     Author = {Ole KrΓΆger and Carleton Coffrin and Hassan Hijazi and Harsha Nagarajan},
     Title = {Juniper: An Open-Source Nonlinear Branch-and-Bound Solver in Julia},
     booktitle="Integration of Constraint Programming, Artificial Intelligence, and Operations Research",
     pages="377--386",
     year="2018",
     publisher="Springer International Publishing",
     isbn="978-3-319-93031-2"
}