
JuliaDiff / ReverseDiff.jl

Licence: other
Reverse Mode Automatic Differentiation for Julia

Programming Languages

julia
2034 projects

Projects that are alternatives to, or similar to, ReverseDiff.jl

ForwardDiff.jl
Forward Mode Automatic Differentiation for Julia
Stars: ✭ 466 (+156.04%)
Mutual labels:  automatic-differentiation, calculus
Computator.net
Computator.NET is a special kind of numerical software that is fast and easy to use but not worse than others feature-wise. It's features include: - Real and complex functions charts - Real and complex calculator - Real functions numerical calculations including different methods - Over 107 Elementary functions - Over 141 Special functions - Over 21 Matrix functions and operations - Scripting language with power to easy computations including matrices - You can declare your own custom functions with scripting language
Stars: ✭ 174 (-4.4%)
Mutual labels:  calculus
Hackermath
Introduction to Statistics and Basics of Mathematics for Data Science - The Hacker's Way
Stars: ✭ 1,380 (+658.24%)
Mutual labels:  calculus
Aesara
Aesara is a fork of the Theano library that is maintained by the PyMC developers. It was previously named Theano-PyMC.
Stars: ✭ 145 (-20.33%)
Mutual labels:  automatic-differentiation
Adcme.jl
Automatic Differentiation Library for Computational and Mathematical Engineering
Stars: ✭ 106 (-41.76%)
Mutual labels:  automatic-differentiation
Backprop
Heterogeneous automatic differentiation ("backpropagation") in Haskell
Stars: ✭ 154 (-15.38%)
Mutual labels:  automatic-differentiation
Cppadcodegen
Source Code Generation for Automatic Differentiation using Operator Overloading
Stars: ✭ 77 (-57.69%)
Mutual labels:  automatic-differentiation
Data Science Masters
Self-study plan to achieve mastery in data science
Stars: ✭ 179 (-1.65%)
Mutual labels:  calculus
Symja android library
☕️ Symja - computer algebra language & symbolic math library. A collection of popular algorithms implemented in pure Java.
Stars: ✭ 170 (-6.59%)
Mutual labels:  calculus
Autograd.jl
Julia port of the Python autograd package.
Stars: ✭ 147 (-19.23%)
Mutual labels:  automatic-differentiation
Dcpp
Automatic differentiation in C++; infinite differentiability of conditionals, loops, recursion and all things C++
Stars: ✭ 143 (-21.43%)
Mutual labels:  automatic-differentiation
Stanford Cme 102 Ordinary Differential Equations
VIP cheatsheets for Stanford's CME 102 Ordinary Differential Equations for Engineers
Stars: ✭ 109 (-40.11%)
Mutual labels:  calculus
Galacticoptim.jl
Local, global, and beyond optimization for scientific machine learning (SciML)
Stars: ✭ 155 (-14.84%)
Mutual labels:  automatic-differentiation
Lazysets.jl
A Julia package for calculus with convex sets
Stars: ✭ 107 (-41.21%)
Mutual labels:  calculus
Qml
Introductions to key concepts in quantum machine learning, as well as tutorials and implementations from cutting-edge QML research.
Stars: ✭ 174 (-4.4%)
Mutual labels:  automatic-differentiation
Enzyme.jl
Julia bindings for the Enzyme automatic differentiator
Stars: ✭ 90 (-50.55%)
Mutual labels:  automatic-differentiation
Calc4b Zh
📖 Chinese translation of MIT 18.03 Calculus for Beginners
Stars: ✭ 114 (-37.36%)
Mutual labels:  calculus
Taylorseries.jl
A Julia package for Taylor polynomial expansions in one and several independent variables.
Stars: ✭ 151 (-17.03%)
Mutual labels:  automatic-differentiation
Quant Notes
Quantitative Interview Preparation Guide, updated version here ==>
Stars: ✭ 180 (-1.1%)
Mutual labels:  calculus
Mitgcm
MIT General Circulation Model master code and documentation repository
Stars: ✭ 177 (-2.75%)
Mutual labels:  automatic-differentiation

ReverseDiff


Go To ReverseDiff's Documentation

See ReverseDiff Usage Examples

ReverseDiff is a fast, compilable, tape-based reverse-mode automatic differentiation (AD) package that implements methods to take gradients, Jacobians, Hessians, and higher-order derivatives of native Julia functions (or any callable object, really).
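
As a quick sketch of that high-level API (the function and inputs below are purely illustrative, not taken from the package's examples):

using ReverseDiff

# Any scalar-valued Julia function of an array input works.
f(x) = sum(abs2, x) + prod(x)

x = rand(5)

g = ReverseDiff.gradient(f, x)            # ∇f(x), a length-5 vector
J = ReverseDiff.jacobian(y -> y .* x, x)  # Jacobian of a vector-valued map
H = ReverseDiff.hessian(f, x)             # 5×5 Hessian of f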

While performance can vary depending on the functions you evaluate, the algorithms implemented by ReverseDiff generally outperform non-AD algorithms in both speed and accuracy.

Wikipedia's entry on automatic differentiation is a useful resource for learning about the advantages of AD techniques over other common differentiation methods (such as finite differencing).

Installation

To install ReverseDiff, simply use Julia's package manager:

julia> using Pkg

julia> Pkg.add("ReverseDiff")

Why use ReverseDiff?

Other Julia packages may provide some of these features, but only ReverseDiff provides all of them (as far as I know at the time of this writing):

  • supports a large subset of the Julia language, including loops, recursion, and control flow
  • user-friendly API for reusing and compiling tapes
  • user-friendly performance annotations such as @forward and @skip (with more to come!)
  • compatible with ForwardDiff, enabling mixed-mode AD
  • built-in definitions leverage the benefits of ForwardDiff's Dual numbers (e.g. SIMD, zero-overhead arithmetic)
  • a familiar differentiation API for ForwardDiff users
  • non-allocating linear algebra optimizations
  • nested differentiation
  • suitable as an execution backend for graphical machine learning libraries
  • ReverseDiff doesn't need to record scalar indexing operations (a huge cost for many similar libraries)
  • higher-order map and broadcast optimizations
  • it's well tested

...and, simply put, it's fast (for gradients, at least). Using the code from examples/gradient.jl:

julia> using BenchmarkTools, ReverseDiff

# this script defines f and ∇f!
julia> include(joinpath(pkgdir(ReverseDiff), "examples", "gradient.jl"));

julia> a, b = rand(100, 100), rand(100, 100);

julia> inputs = (a, b);

julia> results = (similar(a), similar(b));

# Benchmark the original objective function, sum(a' * b + a * b')
julia> @benchmark f($a, $b)
BenchmarkTools.Trial:
  memory estimate:  234.61 kb
  allocs estimate:  6
  --------------
  minimum time:     110.000 μs (0.00% GC)
  median time:      137.416 μs (0.00% GC)
  mean time:        173.085 μs (11.63% GC)
  maximum time:     3.613 ms (91.47% GC)

# Benchmark ∇f! at the same inputs (this executes the function and
# computes the gradients w.r.t. `a` and w.r.t. `b` simultaneously).
# Notice that the whole thing is non-allocating.
julia> @benchmark ∇f!($results, $inputs)
BenchmarkTools.Trial:
  memory estimate:  0.00 bytes
  allocs estimate:  0
  --------------
  minimum time:     429.650 μs (0.00% GC)
  median time:      431.460 μs (0.00% GC)
  mean time:        469.916 μs (0.00% GC)
  maximum time:     937.512 μs (0.00% GC)

I've used this benchmark (and others) to pit ReverseDiff against every other native Julia reverse-mode AD package that I know of (including source-to-source packages), and have found ReverseDiff to be faster and use less memory in most cases.
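
The non-allocating ∇f! above is built on ReverseDiff's tape API: the objective is recorded onto a tape once, the tape is compiled, and the compiled tape is reused for every gradient evaluation. The sketch below mirrors that idea (the actual contents of examples/gradient.jl may differ in detail):

using ReverseDiff

# The objective benchmarked above.
f(a, b) = sum(a' * b + a * b')

a, b = rand(100, 100), rand(100, 100)
inputs = (a, b)
results = (similar(a), similar(b))

# Record f's operations on inputs of this shape onto a tape, then compile
# the tape so it can be re-executed cheaply without re-recording.
const f_tape = ReverseDiff.compile(ReverseDiff.GradientTape(f, inputs))

# Reusing the compiled tape fills `results` with ∂f/∂a and ∂f/∂b in place.
∇f!(results, inputs) = ReverseDiff.gradient!(results, f_tape, inputs)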

Should I use ReverseDiff or ForwardDiff?

ForwardDiff is algorithmically more efficient for differentiating functions where the input dimension is less than the output dimension, while ReverseDiff is algorithmically more efficient for differentiating functions where the output dimension is less than the input dimension.

Thus, ReverseDiff is generally a better choice for gradients, but Jacobians and Hessians are trickier to determine. For example, optimized methods for computing nested derivatives might use a combination of forward-mode and reverse-mode AD.

ForwardDiff is often faster than ReverseDiff for lower dimensional gradients (length(input) < 100), or gradients of functions where the number of input parameters is small compared to the number of operations performed on them. ReverseDiff is often faster if your code is expressed as a series of array operations, e.g. a composition of Julia's Base linear algebra methods.

In general, your choice of algorithms will depend on the function being differentiated, and you should benchmark different methods to see how they fare.
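
For instance, a simple way to compare the two packages on your own function (the function and input sizes below are just placeholders):

using BenchmarkTools, ForwardDiff, ReverseDiff

f(x) = sum(abs2, x) / (1 + sum(x)^2)

x_small = rand(10)      # few inputs: ForwardDiff often wins here
x_large = rand(10_000)  # many inputs, scalar output: ReverseDiff often wins here

@btime ForwardDiff.gradient($f, $x_small);
@btime ReverseDiff.gradient($f, $x_small);

@btime ForwardDiff.gradient($f, $x_large);
@btime ReverseDiff.gradient($f, $x_large);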
