
jump-dev / DiffOpt.jl

License: MIT
Differentiating convex optimization programs w.r.t. program parameters

Programming Languages

julia

Projects that are alternatives of or similar to DiffOpt.jl

adversarial-code-generation
Source code for the ICLR 2021 work "Generating Adversarial Computer Programs using Optimized Obfuscations"
Stars: ✭ 16 (-84.91%)
Mutual labels:  differentiable-programming
human-memory
Course materials for Dartmouth course: Human Memory (PSYC 51.09)
Stars: ✭ 239 (+125.47%)
Mutual labels:  mathematical-modelling
SimInf
A framework for data-driven stochastic disease spread simulations
Stars: ✭ 21 (-80.19%)
Mutual labels:  mathematical-modelling
jaxdf
A JAX-based research framework for writing differentiable numerical simulators with arbitrary discretizations
Stars: ✭ 50 (-52.83%)
Mutual labels:  differentiable-programming
Difftaichi
10 differentiable physical simulators built with Taichi differentiable programming (DiffTaichi, ICLR 2020)
Stars: ✭ 2,024 (+1809.43%)
Mutual labels:  differentiable-programming
EpiModelHIV
Network Models of HIV Transmission Dynamics among MSM and Heterosexuals
Stars: ✭ 20 (-81.13%)
Mutual labels:  mathematical-modelling
BifurcationInference.jl
learning state-space targets in dynamical systems
Stars: ✭ 24 (-77.36%)
Mutual labels:  differentiable-programming
Mathematical-Modeling
Shared materials from learning mathematical modeling: commonly used tools, models, and algorithms, prize-winning papers from mathematical modeling competitions, common algorithm models, LaTeX paper templates, and SPSS resources.
Stars: ✭ 30 (-71.7%)
Mutual labels:  mathematical-modelling
Taichi
Parallel programming for everyone.
Stars: ✭ 17,625 (+16527.36%)
Mutual labels:  differentiable-programming
pomp
R package for statistical inference using partially observed Markov processes
Stars: ✭ 88 (-16.98%)
Mutual labels:  mathematical-modelling
Quadrature.jl
A common interface for quadrature and numerical integration for the SciML scientific machine learning organization
Stars: ✭ 83 (-21.7%)
Mutual labels:  differentiable-programming
Gen.jl
A general-purpose probabilistic programming system with programmable inference
Stars: ✭ 1,595 (+1404.72%)
Mutual labels:  differentiable-programming
gammy
🐙 Generalized additive models in Python with a Bayesian twist
Stars: ✭ 65 (-38.68%)
Mutual labels:  mathematical-modelling
Teg
A differentiable programming language with an integration primitive that soundly handles interactions among the derivative, integral, and discontinuities.
Stars: ✭ 25 (-76.42%)
Mutual labels:  differentiable-programming
autodiff
A .NET library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.
Stars: ✭ 69 (-34.91%)
Mutual labels:  mathematical-modelling
cocp
Source code for the examples accompanying the paper "Learning convex optimization control policies."
Stars: ✭ 61 (-42.45%)
Mutual labels:  differentiable-programming
kendrick
Domain-Specific Modeling for Epidemiology
Stars: ✭ 43 (-59.43%)
Mutual labels:  mathematical-modelling
point-process-nets
Point processes backed by neural net intensity models
Stars: ✭ 39 (-63.21%)
Mutual labels:  mathematical-modelling
RS-MET
Codebase for RS-MET products (Robin Schmidt's Music Engineering Tools)
Stars: ✭ 32 (-69.81%)
Mutual labels:  mathematical-modelling
point-process-rust
Simulation of point processes in the Rust programming language
Stars: ✭ 32 (-69.81%)
Mutual labels:  mathematical-modelling

DiffOpt.jl


DiffOpt is a package for differentiating convex optimization programs with respect to the program parameters. It currently supports linear, quadratic and conic programs. Refer to the documentation for examples. Powered by JuMP.jl, DiffOpt allows creating a differentiable optimization model from many existing optimizers.
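As a one-dimensional sketch of what "differentiating a program" means (illustrative notation, not DiffOpt-specific): for the problem

    min_x  2x    subject to    x >= b,

the optimal solution is x*(b) = b, so the sensitivity of the solution to the parameter b is dx*/db = 1. DiffOpt computes such sensitivities automatically, in both forward and reverse mode, for linear, quadratic, and conic programs.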

Installation

DiffOpt can be installed via the Julia package manager:

julia> ]
(v1.7) pkg> add DiffOpt

Example

  1. Create a model using the DiffOpt wrapper:
using JuMP
import DiffOpt
import HiGHS

model = JuMP.Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
  2. Define your model and solve it:
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)

optimize!(model) # solve
  3. Choose the problem parameters to differentiate with respect to and set their perturbations:
MOI.set.(  # set perturbations / gradient inputs
    model,
    DiffOpt.ReverseVariablePrimal(),
    x,
    1.0,
)
  4. Differentiate the model (specifically, its primal and dual variables) and fetch the gradients:
DiffOpt.reverse_differentiate!(model) # differentiate

grad_exp = MOI.get(   # -3 x - 1
    model,
    DiffOpt.ReverseConstraintFunction(),
    cons
)
JuMP.constant(grad_exp)  # -1
JuMP.coefficient(grad_exp, x)  # -3
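The reverse-mode steps above have a forward-mode counterpart. The following is a hedged sketch that reuses the same model with DiffOpt's forward-mode attributes (ForwardConstraintFunction, forward_differentiate!, ForwardVariablePrimal); the perturbation direction chosen here is illustrative, not taken from the original example.

```julia
using JuMP  # JuMP re-exports MathOptInterface as MOI
import DiffOpt
import HiGHS

model = JuMP.Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
set_silent(model)
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model)

# Perturb the constraint function of `cons` in a chosen direction,
# then propagate that perturbation forward through the solution map.
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, 1.0 * x)
DiffOpt.forward_differentiate!(model)

# Sensitivity of the optimal primal value of x to the perturbation:
dx = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)
```

Forward mode is convenient when you have few parameters and many outputs; reverse mode (as in the example above) is convenient when you have many parameters and a scalar loss.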

Note

The project description data, including the texts, logos, images, and/or trademarks, for each open source project belongs to its rightful owner. If you wish to add or remove any projects, please contact us at [email protected].