
pierreablin / autoptim

License: MIT
Automatic differentiation + optimization

Programming Languages

python

Projects that are alternatives to or similar to autoptim

MissionImpossible
A concise C++17 implementation of automatic differentiation (operator overloading)
Stars: ✭ 18 (-82.35%)
Mutual labels:  autodiff
easytorch
A simple deep learning framework implemented in Python with numpy, including automatic differentiation, optimizers, layers, etc.
Stars: ✭ 76 (-25.49%)
Mutual labels:  autodiff
Tangent
Source-to-Source Debuggable Derivatives in Pure Python
Stars: ✭ 2,209 (+2065.69%)
Mutual labels:  autodiff
Torsten
A library of C++ functions that support applications of Stan in pharmacometrics
Stars: ✭ 38 (-62.75%)
Mutual labels:  autodiff
Nabla.jl
An operator-overloading, tape-based, reverse-mode AD
Stars: ✭ 54 (-47.06%)
Mutual labels:  autodiff
autograd-gamma
NotImplementedError: VJP of gammainc wrt argnum 0 not defined
Stars: ✭ 15 (-85.29%)
Mutual labels:  autodiff
autodiff
A .NET library that provides fast, accurate and automatic differentiation (computes derivative / gradient) of mathematical functions.
Stars: ✭ 69 (-32.35%)
Mutual labels:  autodiff
autodiff
An automatic differentiation tool written in 200 lines
Stars: ✭ 37 (-63.73%)
Mutual labels:  autodiff
buildTensorflow
A lightweight deep learning framework made with ❤️
Stars: ✭ 28 (-72.55%)
Mutual labels:  autodiff
sunode
Solve ODEs fast, with support for PyMC
Stars: ✭ 67 (-34.31%)
Mutual labels:  autodiff

autoptim: automatic differentiation + optimization

Do you have a new machine learning model that you want to optimize, but do not want to bother computing the gradients? Autoptim is for you.

Warning:

As of version 0.3, PyTorch has been replaced with autograd for automatic differentiation. This makes interfacing with NumPy even simpler.

Short presentation

Autoptim is a small Python package that blends autograd's automatic differentiation into scipy.optimize.minimize.

The gradients are computed under the hood using automatic differentiation; the user only provides the objective function:

import numpy as np
from autoptim import minimize


def rosenbrock(x):
    # The classic 2-D Rosenbrock function, whose minimum is at [1, 1].
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2


x0 = np.zeros(2)

# No gradient is passed: it is computed by automatic differentiation.
x_min, _ = minimize(rosenbrock, x0)
print(x_min)

>>> [0.99999913 0.99999825]

It comes with the following features:

  • Natural interfacing with NumPy: The objective function is written in standard NumPy. The inputs and outputs of autoptim.minimize are NumPy arrays.

  • Smart input processing: scipy.optimize.minimize only deals with one-dimensional arrays as input. In autoptim, the variables can be multi-dimensional arrays or lists of arrays (see the first sketch after this list).

  • Preconditioning: Preconditioning is a simple way to accelerate minimization through a change of variables. autoptim makes preconditioning straightforward (see the second sketch after this list).
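
As an illustration of the input processing, here is a minimal sketch of optimizing over two variables of different shapes at once. The quadratic objective is made up for the example, and the sketch assumes that a list of arrays is accepted as the starting point and that the objective receives the variables as separate arguments; the exact calling convention is documented in the tutorials.

import numpy as np
from autoptim import minimize


def loss(x, y):
    # x is a vector and y a matrix: no manual flattening is needed.
    return np.sum((x - 1) ** 2) + np.sum((y + 2) ** 2)


x0 = np.zeros(3)
y0 = np.zeros((4, 5))

# Assumption: the minimizers come back with their original shapes.
(x_min, y_min), _ = minimize(loss, [x0, y0])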
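
Preconditioning can be sketched on a badly scaled quadratic. The precon_fwd / precon_bwd keyword names, and the convention that precon_fwd maps the variables to the preconditioned space while precon_bwd inverts it, are assumptions about the API; refer to the tutorials for the exact signature.

import numpy as np
from autoptim import minimize

scales = np.array([1., 1000.])  # coordinates on very different scales


def f(x):
    return np.sum((scales * x - 1) ** 2)


def precon_fwd(x):
    # Change of variables z = scales * x: f is well conditioned in z.
    return scales * x


def precon_bwd(z):
    # Inverse mapping, x = z / scales.
    return z / scales


x0 = np.zeros(2)
# precon_fwd / precon_bwd are assumed keyword names for the hooks.
x_min, _ = minimize(f, x0, precon_fwd=precon_fwd, precon_bwd=precon_bwd)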

Disclaimer

This package is meant to be as easy to use as possible. As such, some compromises are made on the speed of minimization.

Installation

To install, use pip:

pip install autoptim

Dependencies

  • numpy>=1.12
  • scipy>=0.18.0
  • autograd>=1.2

Examples

Several examples can be found in autoptim/tutorials.
