facebookresearch / theseus

License: MIT license

Programming Languages

Python, C++, CUDA

Projects that are alternatives to or similar to theseus

Ceres Solver
A large scale non-linear optimization library
Stars: ✭ 2,180 (+73.43%)
Mutual labels:  levenberg-marquardt, gauss-newton, nonlinear-least-squares
least-squares-cpp
A single header-only C++ library for least squares fitting.
Stars: ✭ 46 (-96.34%)
Mutual labels:  levenberg-marquardt, gauss-newton
axxb calibration
Comprehensive AX = XB calibration solvers in MATLAB
Stars: ✭ 19 (-98.49%)
Mutual labels:  levenberg-marquardt
levenberg-marquardt
Curve fitting method in JavaScript
Stars: ✭ 63 (-94.99%)
Mutual labels:  levenberg-marquardt
nim-mpfit
A wrapper for the cMPFIT library for the Nim programming language, https://vindaar.github.io/nim-mpfit/
Stars: ✭ 18 (-98.57%)
Mutual labels:  levenberg-marquardt
stocBiO
Example code for paper "Bilevel Optimization: Nonasymptotic Analysis and Faster Algorithms"
Stars: ✭ 22 (-98.25%)
Mutual labels:  bilevel-optimization
DynAdjust
Least squares adjustment software
Stars: ✭ 43 (-96.58%)
Mutual labels:  nonlinear-least-squares
GALAHAD
A library of modern Fortran modules for nonlinear optimization
Stars: ✭ 60 (-95.23%)
Mutual labels:  nonlinear-least-squares
AuxiLearn
Official implementation of Auxiliary Learning by Implicit Differentiation [ICLR 2021]
Stars: ✭ 71 (-94.35%)
Mutual labels:  implicit-differentiation
robustnav
Evaluating pre-trained navigation agents under corruptions
Stars: ✭ 18 (-98.57%)
Mutual labels:  embodied-ai
language-planner
Official Code for "Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents"
Stars: ✭ 84 (-93.32%)
Mutual labels:  embodied-ai
manipulathor
ManipulaTHOR, a framework that facilitates visual manipulation of objects using a robotic arm
Stars: ✭ 64 (-94.91%)
Mutual labels:  embodied-ai


A library for differentiable nonlinear optimization

Paper • Blog • Webpage • Tutorials • Docs

Theseus is an efficient application-agnostic library for building custom nonlinear optimization layers in PyTorch to support constructing various problems in robotics and vision as end-to-end differentiable architectures.

Differentiable nonlinear optimization provides a general scheme to encode inductive priors: the objective function can be parameterized partly by neural models and partly by expert, domain-specific differentiable models. The ability to compute gradients end-to-end is retained by differentiating through the optimizer, which allows neural models to train on the final task loss while also taking advantage of the priors captured by the optimizer.
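
Schematically, this yields a bilevel problem (notation ours, for illustration, not taken from the paper): the optimization layer solves an inner nonlinear least-squares problem whose residuals $f_i$ depend on neural parameters $\phi$,

$$\theta^*(\phi) = \arg\min_{\theta} \sum_i \| f_i(\theta; \phi) \|^2,$$

while the outer task loss $L$ trains $\phi$ end-to-end through the solution, $\min_\phi L(\theta^*(\phi))$, with $\frac{dL}{d\phi} = \frac{\partial L}{\partial \theta^*} \frac{\partial \theta^*}{\partial \phi}$ computed by differentiating through the optimizer.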


Current Features

Application-agnostic interface

Our implementation provides an easy-to-use interface to build custom optimization layers and plug them into any neural architecture. The following differentiable features are currently available:

  • Second-order nonlinear optimizers: Gauss-Newton, Levenberg-Marquardt
  • Dense and sparse linear solvers: Cholesky, LU, CHOLMOD, LU (CUDA), BaSpaCho
  • Commonly used cost functions, AutoDiffCostFunction, and RobustCostFunction
  • Lie groups and robot kinematics
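
As a brief illustration of the interface, here is a minimal sketch of wrapping a TheseusLayer inside a standard torch.nn.Module so that a neural model parameterizes part of the objective. The module, tensor names, and shapes are hypothetical; the objective is assumed to expose an auxiliary variable "x" and an optimization variable "v", as in the full example further below.

import torch
import theseus as th

class OptimizationModule(torch.nn.Module):
    # Hypothetical module: an MLP predicts the auxiliary tensor "x",
    # and a TheseusLayer solves for the optimization variable "v".
    def __init__(self, objective, n):
        super().__init__()
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(n, n), torch.nn.ReLU(), torch.nn.Linear(n, n))
        self.layer = th.TheseusLayer(th.GaussNewton(objective, max_iterations=10))

    def forward(self, data):  # data: shape (batch_size, n)
        x = self.mlp(data)  # the neural model parameterizes the objective
        solution, info = self.layer.forward(
            input_tensors={"x": x, "v": torch.ones(data.shape[0], 1)})
        return solution["v"]  # differentiable w.r.t. the MLP's parameters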

Efficiency-based design

We support several features that improve computation times and memory consumption:

  • Sparse linear solvers
  • Batching and GPU acceleration
  • Automatic vectorization of cost functions
  • Backward modes: unroll, implicit, truncated, and direct loss minimization (DLM)
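
For instance, a sparse linear solver and a constant-memory backward mode can be selected when building the layer. Below is a minimal sketch, assuming th.LevenbergMarquardt accepts a linear_solver_cls argument and that th.CholmodSparseSolver is available in your installation; treat it as illustrative rather than canonical.

import torch
import theseus as th

# A tiny one-variable objective, just to make the snippet self-contained.
v = th.Vector(1, name="v")
target = th.Variable(torch.zeros(2, 1), name="target")

def error_fn(optim_vars, aux_vars):  # residual: v - target
    return optim_vars[0].tensor - aux_vars[0].tensor

objective = th.Objective()
objective.add(th.AutoDiffCostFunction(
    [v], error_fn, 1, aux_vars=[target],
    cost_weight=th.ScaleCostWeight(1.0)))

# Pick a sparse linear solver and the implicit backward mode.
optimizer = th.LevenbergMarquardt(
    objective,
    linear_solver_cls=th.CholmodSparseSolver,  # CHOLMOD via suitesparse (CPU)
    max_iterations=15,
)
layer = th.TheseusLayer(optimizer)
solution, info = layer.forward(
    input_tensors={"v": torch.ones(2, 1), "target": torch.zeros(2, 1)},
    optimizer_kwargs={"backward_mode": "implicit"},  # memory cost independent
)                                                    # of inner iteration count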

Getting Started

Prerequisites

  • We strongly recommend you install Theseus in a venv or conda environment with Python 3.8-3.10.
  • Theseus requires torch installation. To install for your particular CPU/CUDA configuration, follow the instructions in the PyTorch website.
  • For GPU support, Theseus requires nvcc to compile custom CUDA operations. Make sure its version matches the one used to compile PyTorch (check with nvcc --version); if not, install it and ensure its location is on your system's $PATH variable. A quick version check is shown after this list.
  • Theseus also requires suitesparse, which you can install via:
    • sudo apt-get install libsuitesparse-dev (Ubuntu).
    • conda install -c conda-forge suitesparse (Mac).
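
To compare the CUDA version PyTorch was compiled with against your nvcc, you can run:

python -c "import torch; print(torch.version.cuda)"
nvcc --version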

Installing

  • pypi

    pip install theseus-ai

    We currently provide wheels with our CUDA extensions compiled using CUDA 11.6 and Python 3.10. For other CUDA versions, consider installing from source or using our build script.

    Note that pypi installation doesn't include our experimental Theseus Labs. For this, please install from source.

  • From source

    The simplest way to install Theseus from source is by running the following (see further below for how to also include BaSpaCho):

    git clone https://github.com/facebookresearch/theseus.git && cd theseus
    pip install -e .

    If you are interested in contributing to Theseus, instead install

    pip install -e ".[dev]"

    and follow the more detailed instructions in CONTRIBUTING.

  • Installing BaSpaCho extensions from source

    By default, installing from source doesn't include our BaSpaCho sparse solver extension. For this, follow these steps:

    1. Compile BaSpaCho from source following the instructions here. We recommend using the flags -DBLA_STATIC=ON -DBUILD_SHARED_LIBS=OFF (a sample invocation is sketched after these steps).

    2. Run

      git clone https://github.com/facebookresearch/theseus.git && cd theseus
      BASPACHO_ROOT_DIR=<path/to/root/baspacho/dir> pip install -e .

      where the BaSpaCho root dir must have the binaries in the subdirectory build.
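
For reference, a typical CMake invocation that produces binaries in a build subdirectory might look as follows; the exact flags and steps should come from BaSpaCho's own instructions, so treat this as a hypothetical sketch:

cd <path/to/root/baspacho/dir>
cmake -S . -B build -DBLA_STATIC=ON -DBUILD_SHARED_LIBS=OFF
cmake --build build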

Running unit tests (requires dev installation)

python -m pytest tests

By default, the unit tests include tests for our CUDA extensions. If you installed without CUDA support, you can add the option -m "not cudaext" to skip them (see the example below). Tests for the BaSpaCho sparse solver are likewise skipped automatically when its extlib is not compiled.
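
For example:

python -m pytest tests -m "not cudaext"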

Examples

Simple example. This example fits the curve $y = v e^x$ to a dataset of $N$ observations $(x, y) \sim D$. This is modeled as an Objective with a single CostFunction that computes the residual $y - v e^x$. The Objective and the GaussNewton optimizer are encapsulated in a TheseusLayer. With Adam and an MSE loss, $x$ is learned by differentiating through the TheseusLayer.

import torch
import theseus as th

x_true, y_true, v_true = read_data() # shapes (1, N), (1, N), (1, 1)
x = th.Variable(torch.randn_like(x_true), name="x")
y = th.Variable(y_true, name="y")
v = th.Vector(1, name="v") # a manifold subclass of Variable for optim_vars

def error_fn(optim_vars, aux_vars): # returns y - v * exp(x)
    x, y = aux_vars
    return y.tensor - optim_vars[0].tensor * torch.exp(x.tensor)

objective = th.Objective()
cost_function = th.AutoDiffCostFunction(
    [v], error_fn, y_true.shape[1], aux_vars=[x, y],
    cost_weight=th.ScaleCostWeight(1.0))
objective.add(cost_function)
layer = th.TheseusLayer(th.GaussNewton(objective, max_iterations=10))

phi = torch.nn.Parameter(x_true + 0.1 * torch.ones_like(x_true))
outer_optimizer = torch.optim.Adam([phi], lr=0.001)
for epoch in range(10):
    outer_optimizer.zero_grad()  # clear gradients accumulated by backward()
    solution, info = layer.forward(
        input_tensors={"x": phi.clone(), "v": torch.ones(1, 1)},
        optimizer_kwargs={"backward_mode": "implicit"})
    outer_loss = torch.nn.functional.mse_loss(solution["v"], v_true)
    outer_loss.backward()
    outer_optimizer.step()

See tutorials, and robotics and vision examples to learn about the API and usage.

Citing Theseus

If you use Theseus in your work, please cite the paper with the BibTeX below.

@article{pineda2022theseus,
  title   = {{Theseus: A Library for Differentiable Nonlinear Optimization}},
  author  = {Luis Pineda and Taosha Fan and Maurizio Monge and Shobha Venkataraman and Paloma Sodhi and Ricky TQ Chen and Joseph Ortiz and Daniel DeTone and Austin Wang and Stuart Anderson and Jing Dong and Brandon Amos and Mustafa Mukadam},
  journal = {Advances in Neural Information Processing Systems},
  year    = {2022}
}

License

Theseus is MIT licensed. See the LICENSE for details.

Additional Information

Theseus is made possible by its contributors (contributor image made with contrib.rocks).
