dCpp v0.1.0

dCpp

Automatic differentiation in C++; infinite differentiability of conditionals, loops, recursion and all things C++

dCpp is a tool for automatic differentiation made to be intuitive to the mind of a C++ programmer and non-invasive to their process. Despite its ease of use it retains flexibility, allowing implementations of differentiable (sub)programs operating on differentiable derivatives of other (sub)programs, where the entire process may again be differentiable. This allows trainable training processes and flexible program analysis through operational calculus.

dCpp was originally developed as an example of how automatic differentiation can be viewed through Tensor and Operational Calculus. It has since been applied to a variety of tasks from dynamical systems analysis and digital geometry, to general program analysis and optimization by various parties.

Note that this was the author's first C++ project, which is reflected in the repository :).

Tutorial

We demonstrate the utilities of dCpp on a simple running example.

First we include the necessities

#include <iostream>
#include <dCpp.h>

We initialize an n-differentiable programming space

using namespace dCpp;
int n_differentiable = 3;
dCpp::initSpace(n_differentiable);

The Basics

The API of var complies with that of the standard C++ types, and when an instance of var is left uninitialized it behaves as the type double would. We may envision an instance of var as an element of the differentiable virtual memory algebra, elevating C++ to a differentiable programming space dCpp. This means that any program can be made differentiable by simply substituting the type double with the type var, leaving the user's coding process unchanged towards the initially intended goal.

By coding a simple recursive function foo we see that the usual usage of constructs such as conditionals, loops and recursion remains unchanged.

var foo(const var& x, const var& y)
{
    if(x < 1)
        return y;
    else if(y < 1)
        return x;
    else
        return x / foo(x / 2, y) + y * foo(x, y / 3);
}

To test it, we declare two instances of var.

var x=10;
var y=13;

Variables with respect to which differentiation is to be performed need to be initialized as such. This ensures that uninitialized instances behave as the type double does, with the difference that all instances of var are differentiable with respect to all initialized instances.

dCpp::init(x);
dCpp::init(y);

The derivatives are extracted by specifying the memory location of the variable with respect to which differentiation is to be performed.

var f = foo(x,y);
std::cout <<  f << std::endl;
std::cout <<  f.d(&x) << std::endl;
std::cout << f.d(&y) << std::endl;

884.998
82.1202
193.959

Differentiable derivatives

The virtual memory space is constructed through tensor products of the C++ internal representation of memory. This means that derivatives are themselves elements of the differentiable virtual memory.

var fx = f.d(&x);
std::cout <<  fx.d(&x) << std::endl;
std::cout <<  fx.d(&y) << std::endl;
var fy = f.d(&y);
std::cout <<  fy.d(&x) << std::endl;
std::cout <<  fy.d(&y) << std::endl;

-0.103319
18.7722
18.7722
28.8913

We can thus employ derivatives of f in further (n-1)-differentiable calculations.

var F = dCpp::sqrt((fx^2) + (fy^2));
std::cout <<  F << std::endl;
std::cout <<  F.d(&x) << std::endl;
std::cout <<  F.d(&y) << std::endl;

210.627
17.2464
33.9239

As the derivatives of f are (n-1)-differentiable (twice differentiable, in our case), we can interweave them in calculations containing f itself.

var t = dCpp::sqrt(((fx^2) + (fy^2)) / f);
std::cout <<  t << std::endl;
std::cout <<  t.d(&x) << std::endl;
std::cout <<  t.d(&y) << std::endl;

7.08016
0.251241
0.364486

This is particularly useful when analyzing and optimizing differential equations, where usually both f and its (higher) derivatives appear in the same expression.

A note on the order of differentiability

The order of an expression is that of the lowest order of the expressions appearing in its construction.

Expression                      Order
f                               3
fx = f.d(&x)                    2
fy = f.d(&y)                    2
(fx^2 + fy^2) / f               2
fxx = fx.d(&x)                  1
fxy = fx.d(&y)                  1
f * (fxy + fxx) / (fx - fy)     1

This means that when we want to perform some non-differentiable operation on an expression, such as updating a variable in a gradient descent iteration, we should extract the value of its derivative through the id attribute of the var instance.

double lambda = 0.1;
double fx_double = f.d(&x).id;
x -= lambda * fx_double;
double fy_double = f.d(&y).id;
y -= lambda * fy_double;

An example of gradient descent can be found in examples/barycenterGD, with a detailed explanation available in the corresponding closed issue.

Operator dTau

If a certain mapping the user desires is not provided in the dCpp namespace, but its derivative exists, they may create the desired map by employing the operator tau.

Let's assume that the map log is not provided and create it using tau, by providing it with two maps: the map itself, log: double --> double, and its derivative, log_primitive: var --> var.

var log_primitive(const var& v)
{
   // derivative of the natural logarithm: d/dv log(v) = 1 / v
   return 1 / v;
}

tau log(std::log, log_primitive);

The map is now ready to use

var l=log(((x^2) - (y^0.23))^2.1);
std::cout <<  l << std::endl;
std::cout <<  l.d(&x) << std::endl;
std::cout <<  l.d(&y) << std::endl;

9.63263
0.427715
-0.000682522

Examples

Further reading

As this tutorial is quite brief, please consult the discussions regarding common mistakes and solutions, or consult the accompanying papers.

Citation

If you use dCpp in your work, please cite one of the following papers:

Žiga Sajovic, et al.: Operational Calculus for Differentiable Programming. arXiv e-prints arXiv:1610.07690 (2016)

@article{sajovic2016operational,
    Author = {Žiga Sajovic, et al.},
    Title = {Operational Calculus for Differentiable Programming},
    journal = {arXiv e-prints},
    Year = 2016,
    volume = {arXiv:1610.07690},
    Eprint = {1610.07690},
    Eprinttype = {arXiv},
}

Žiga Sajovic: Automatic Differentiation: a look through Tensor and Operational Calculus. arXiv e-prints arXiv:1612.02731 (2016)

@article{sajovic2016automatic,
    Author = {Žiga Sajovic},
    Title = {Automatic Differentiation: a look through Tensor and Operational Calculus},
    journal = {arXiv e-prints},
    Year = 2016,
    volume = {arXiv:1612.02731},
    Eprint = {1612.02731},
    Eprinttype = {arXiv},
}

Creative Commons License
dC++ by Žiga Sajovic is licensed under a Creative Commons Attribution 4.0 International License.
