
Non-Contradiction / autodiffr

Licence: other
Automatic Differentiation for R

Programming Languages: r (7636 projects), julia (2034 projects)

Projects that are alternatives of or similar to autodiffr (all share the mutual label automatic-differentiation):

  • Mitgcm - M.I.T General Circulation Model master code and documentation repository (Stars: 177, +742.86%)
  • AdFem.jl - Innovative, efficient, and computational-graph-based finite element simulator for inverse modeling (Stars: 62, +195.24%)
  • Fortran-Tools - Fortran compilers, preprocessors, static analyzers, transpilers, IDEs, build systems, etc. (Stars: 31, +47.62%)
  • Tangent - Source-to-Source Debuggable Derivatives in Pure Python (Stars: 2,209, +10419.05%)
  • admc - Infinite order automatic differentiation for Monte Carlo with unnormalized probability distribution (Stars: 17, -19.05%)
  • MultiScaleArrays.jl - A framework for developing multi-scale arrays for use in scientific machine learning (SciML) simulations (Stars: 63, +200%)
  • Chainrules.jl - Forward and reverse mode automatic differentiation primitives for Julia Base + StdLibs (Stars: 162, +671.43%)
  • Scientific-Programming-in-Julia - Repository for B0M36SPJ (Stars: 32, +52.38%)
  • omd - JAX code for the paper "Control-Oriented Model-Based Reinforcement Learning with Implicit Differentiation" (Stars: 43, +104.76%)
  • cgdms - Differentiable molecular simulation of proteins with a coarse-grained potential (Stars: 44, +109.52%)
  • Aerosandbox - Aircraft design optimization made fast through modern automatic differentiation; plug-and-play analysis tools for aerodynamics, propulsion, structures, trajectory design, and much more (Stars: 193, +819.05%)
  • MASA - Method of Manufactured Solutions Repository (Stars: 46, +119.05%)
  • Causing - CAUsal INterpretation using Graphs (Stars: 47, +123.81%)
  • Reversediff.jl - Reverse Mode Automatic Differentiation for Julia (Stars: 182, +766.67%)
  • MissionImpossible - A concise C++17 implementation of automatic differentiation (operator overloading) (Stars: 18, -14.29%)
  • Qml - Introductions to key concepts in quantum machine learning, as well as tutorials and implementations from cutting-edge QML research (Stars: 174, +728.57%)
  • ADAM - A collection of algorithms for calculating rigid-body dynamics in Jax, CasADi, PyTorch, and Numpy (Stars: 51, +142.86%)
  • Tensorial.jl - Statically sized tensors and related operations for Julia (Stars: 18, -14.29%)
  • xcfun - XCFun: a library of exchange-correlation functionals with arbitrary-order derivatives (Stars: 50, +138.1%)
  • Tensors.jl - Efficient computations with symmetric and non-symmetric tensors with support for automatic differentiation (Stars: 142, +576.19%)

autodiffr for Automatic Differentiation in R through Julia


Package autodiffr provides an R wrapper for the Julia packages ForwardDiff.jl and ReverseDiff.jl through JuliaCall, to do automatic differentiation for native R functions.

Installation

Julia is needed to use autodiffr. You can download a generic Julia binary from https://julialang.org/downloads/ and add it to the PATH. Package autodiffr is not on CRAN yet; you can get the development version of autodiffr by

devtools::install_github("Non-Contradiction/autodiffr")

Important: Julia v0.6.x, v0.7.0, and v1.0 are all currently supported by autodiffr, but to use autodiffr with Julia v0.7/v1.0, you need the development version of JuliaCall:

devtools::install_github("Non-Contradiction/JuliaCall")

Basic Usage

library(autodiffr)

## Do initial setup

ad_setup()
#> Julia version 1.0.0 at location /Applications/Julia-1.0.app/Contents/Resources/julia/bin will be used.
#> Loading setup script for JuliaCall...
#> Finish loading setup script for JuliaCall.

## If you want to use a Julia installation at a specific location, you can do:
## ad_setup(JULIA_HOME = "the folder that contains julia binary"),
## or set JULIA_HOME in the command line environment, or use `options(...)`.

## Define a target function with vector input and scalar output
f <- function(x) sum(x^2L)

## Calculate gradient of f at [2,3] by
ad_grad(f, c(2, 3)) ## deriv(f, c(2, 3))
#> [1] 4 6
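
## autodiffr wraps both ForwardDiff.jl and ReverseDiff.jl, so gradients
## can be computed in either forward or reverse mode. A hedged sketch,
## assuming the installed version exposes a `mode` argument (this name is
## an assumption, not taken from the examples here; check ?ad_grad):
ad_grad(f, c(2, 3), mode = "forward")
#> [1] 4 6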

## Get a gradient function g
g <- makeGradFunc(f)

## Evaluate the gradient function g at [2,3]
g(c(2, 3))
#> [1] 4 6

## Calculate hessian of f at [2,3] by
ad_hessian(f, c(2, 3))
#>      [,1] [,2]
#> [1,]    2    0
#> [2,]    0    2

## Get a hessian function h
h <- makeHessianFunc(f)

## Evaluate the hessian function h at [2,3]
h(c(2, 3))
#>      [,1] [,2]
#> [1,]    2    0
#> [2,]    0    2
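
## The gradient and Hessian combine into a Newton step. A small
## illustration, not from the original README but using only the
## functions shown above: for f(x) = sum(x^2) the gradient is 2*x and
## the Hessian is 2*I, so one Newton step from [2,3] lands exactly at
## the minimizer, the origin.
x0 <- c(2, 3)
x1 <- x0 - solve(ad_hessian(f, x0), ad_grad(f, x0))
x1
#> [1] 0 0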

## Define a target function with vector input and vector output
f <- function(x) x^2

## Calculate jacobian of f at [2,3] by
ad_jacobian(f, c(2, 3))
#>      [,1] [,2]
#> [1,]    4    0
#> [2,]    0    6

## Get a jacobian function j
j <- makeJacobianFunc(f)

## Evaluate the jacobian function j at [2,3]
j(c(2, 3))
#>      [,1] [,2]
#> [1,]    4    0
#> [2,]    0    6
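
Because these derivatives come from automatic differentiation, they are exact up to floating point, unlike numerical approximations. As a sanity check you can compare against central finite differences in base R; `fd_grad` below is a hypothetical helper written for this sketch, not part of autodiffr.

## Hypothetical helper: central finite differences, for comparison only.
fd_grad <- function(f, x, h = 1e-6) {
  vapply(seq_along(x), function(i) {
    e <- replace(numeric(length(x)), i, h)  ## h times the i-th unit vector
    (f(x + e) - f(x - e)) / (2 * h)
  }, numeric(1))
}

f <- function(x) sum(x^2)  ## scalar-valued again, for a gradient check
ad_grad(f, c(2, 3))        ## exact: 4 6
fd_grad(f, c(2, 3))        ## approximately 4 6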

Advanced Usage

Functions with Multiple Arguments

## Define a target function with multiple arguments
f <- function(a = 1, b = 2, c = 3) a * b ^ 2 * c ^ 3

## Calculate gradient/derivative of f at a = 2, when b = c = 1 by
ad_grad(f, 2, b = 1, c = 1) ## deriv(f, 2, b = 1, c = 1)
#> [1] 1

## Get a gradient/derivative function g w.r.t a when b = c = 1 by
g <- makeGradFunc(f, b = 1, c = 1)

## Evaluate the gradient/derivative function g at a = 2
g(2)
#> [1] 1

## Calculate gradient/derivative of f at a = 2, b = 3, when c = 1 by
ad_grad(f, list(a = 2, b = 3), c = 1)
#> $a
#> [1] 9
#> 
#> $b
#> [1] 12

## Get a gradient/derivative function g w.r.t a and b when c = 1 by
g <- makeGradFunc(f, c = 1)

## Evaluate the gradient/derivative function g at a = 2, b = 3
g(list(a = 2, b = 3))
#> $a
#> [1] 9
#> 
#> $b
#> [1] 12
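
A natural use for the closures returned by makeGradFunc is supplying analytic gradients to an optimizer. A short sketch, not from the original README; the Rosenbrock function here is only an illustration:

## Hypothetical example objective: the Rosenbrock "banana" function.
rosenbrock <- function(x) (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2

## Exact gradient from autodiffr, instead of optim's internal
## finite-difference approximation.
gr <- makeGradFunc(rosenbrock)

optim(c(-1.2, 1), fn = rosenbrock, gr = gr, method = "BFGS")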

Troubleshooting and Ways to Get Help

Julia is not found

Make sure the Julia installation is correct. autodiffr can find Julia automatically when it is on the PATH; if Julia is not on the PATH, there are three ways to tell autodiffr where it is, illustrated in the sketch after this list:

  • Use ad_setup(JULIA_HOME = "the folder that contains julia binary")
  • Use options(JULIA_HOME = "the folder that contains julia binary")
  • Set JULIA_HOME in command line environment.
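
All three options point autodiffr to the same thing: the folder that contains the julia binary. A minimal sketch, reusing the macOS location printed by ad_setup() earlier as a placeholder path:

ad_setup(JULIA_HOME = "/Applications/Julia-1.0.app/Contents/Resources/julia/bin")

## or, before calling ad_setup():
options(JULIA_HOME = "/Applications/Julia-1.0.app/Contents/Resources/julia/bin")
ad_setup()

## or set the JULIA_HOME environment variable, e.g. in ~/.Renviron:
## JULIA_HOME="/Applications/Julia-1.0.app/Contents/Resources/julia/bin"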

Suggestions and Issue Reporting

autodiffr is under active development. Suggestions and issue reports are welcome! You can open an issue at https://github.com/Non-Contradiction/autodiffr/issues/new, or email me at [email protected] or [email protected].

Acknowledgement

The autodiffr project was a Google Summer of Code (GSoC) 2018 project for the “R Project for Statistical Computing”, with mentors John Nash and Hans W. Borchers. Thanks a lot!
