
JuliaDiff / TaylorSeries.jl

License: MIT
A Julia package for Taylor polynomial expansions in one and several independent variables.

Programming Language

Julia

Projects that are alternatives to or similar to TaylorSeries.jl

Autodiff
automatic differentiation made easier for C++
Stars: ✭ 641 (+324.5%)
Mutual labels:  automatic-differentiation
AutoPPL
C++ template library for probabilistic programming
Stars: ✭ 34 (-77.48%)
Mutual labels:  automatic-differentiation
Enzyme.jl
Julia bindings for the Enzyme automatic differentiator
Stars: ✭ 90 (-40.4%)
Mutual labels:  automatic-differentiation
Arraymancer
A fast, ergonomic and portable tensor library in Nim with a deep learning focus for CPU, GPU and embedded devices via OpenMP, Cuda and OpenCL backends
Stars: ✭ 793 (+425.17%)
Mutual labels:  automatic-differentiation
Spago
Self-contained Machine Learning and Natural Language Processing library in Go
Stars: ✭ 854 (+465.56%)
Mutual labels:  automatic-differentiation
QuantumFlow
QuantumFlow: A Quantum Algorithms Development Toolkit
Stars: ✭ 43 (-71.52%)
Mutual labels:  automatic-differentiation
Math
The Stan Math Library is a C++ template library for automatic differentiation of any order using forward, reverse, and mixed modes. It includes a range of built-in functions for probabilistic modeling, linear algebra, and equation solving.
Stars: ✭ 494 (+227.15%)
Mutual labels:  automatic-differentiation
Autograd.jl
Julia port of the Python autograd package.
Stars: ✭ 147 (-2.65%)
Mutual labels:  automatic-differentiation
jax-fenics-adjoint
Differentiable interface to FEniCS for JAX using dolfin-adjoint/pyadjoint
Stars: ✭ 32 (-78.81%)
Mutual labels:  automatic-differentiation
CppADCodeGen
Source Code Generation for Automatic Differentiation using Operator Overloading
Stars: ✭ 77 (-49.01%)
Mutual labels:  automatic-differentiation
PennyLane
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
Stars: ✭ 800 (+429.8%)
Mutual labels:  automatic-differentiation
Zygote.jl
Source-to-source automatic differentiation in Julia
Stars: ✭ 933 (+517.88%)
Mutual labels:  automatic-differentiation
TensorNetworkAD.jl
Algorithms that combine tensor network methods with automatic differentiation
Stars: ✭ 54 (-64.24%)
Mutual labels:  automatic-differentiation
DeepLearning.scala
A simple library for creating complex neural networks
Stars: ✭ 745 (+393.38%)
Mutual labels:  automatic-differentiation
ADCME.jl
Automatic Differentiation Library for Computational and Mathematical Engineering
Stars: ✭ 106 (-29.8%)
Mutual labels:  automatic-differentiation
Control Toolbox
The Control Toolbox - An Open-Source C++ Library for Robotics, Optimal and Model Predictive Control
Stars: ✭ 562 (+272.19%)
Mutual labels:  automatic-differentiation
Qualia2.0
Qualia is a deep learning framework deeply integrated with automatic differentiation and dynamic graphing with CUDA acceleration. Qualia was built from scratch.
Stars: ✭ 41 (-72.85%)
Mutual labels:  automatic-differentiation
Aesara
Aesara is a fork of the Theano library that is maintained by the PyMC developers. It was previously named Theano-PyMC.
Stars: ✭ 145 (-3.97%)
Mutual labels:  automatic-differentiation
dCpp
Automatic differentiation in C++; infinite differentiability of conditionals, loops, recursion and all things C++
Stars: ✭ 143 (-5.3%)
Mutual labels:  automatic-differentiation
OMEinsum.jl
One More Einsum for Julia! With runtime order-specification and high-level adjoints for AD
Stars: ✭ 72 (-52.32%)
Mutual labels:  automatic-differentiation

TaylorSeries.jl

A Julia package for Taylor polynomial expansions in one or more independent variables.

Authors

  • Luis Benet, Instituto de Ciencias Físicas, Universidad Nacional Autónoma de México (UNAM)
  • David P. Sanders, Facultad de Ciencias, Universidad Nacional Autónoma de México (UNAM)

Comments, suggestions and improvements are welcome and appreciated.

Examples

Taylor series in one variable

julia> using TaylorSeries

julia> t = Taylor1(Float64, 5)
 1.0 t + 𝒪(t⁶)

julia> exp(t)
 1.0 + 1.0 t + 0.5 t² + 0.16666666666666666 t³ + 0.041666666666666664 t⁴ + 0.008333333333333333 t⁵ + 𝒪(t⁶)

julia> log(1 + t)
 1.0 t - 0.5 t² + 0.3333333333333333 t³ - 0.25 t⁴ + 0.2 t⁵ + 𝒪(t⁶)
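
Since the coefficient of order k stores f⁽ᵏ⁾(0)/k!, a truncated series can also be evaluated at a concrete point. A minimal sketch using the package's evaluate function (printed output elided):

julia> evaluate(exp(t), 0.1)   # ≈ exp(0.1); the error is 𝒪(0.1⁶), from truncating at order 5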

Multivariate Taylor series

julia> x, y = set_variables("x y", order=2);

julia> exp(x + y)
1.0 + 1.0 x + 1.0 y + 0.5 x² + 1.0 x y + 0.5 y² + 𝒪(‖x‖³)
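
Elementary functions compose on truncated multivariate series just as on ordinary numbers. A minimal sketch (printed output elided) checking that sin² + cos² recovers 1 up to the truncation order:

julia> sin(x + y)^2 + cos(x + y)^2   # ≈ 1.0 + 𝒪(‖x‖³); the higher-order terms cancel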

Differential and integral calculus on Taylor series:

julia> x, y = set_variables("x y", order=4);

julia> p = x^3 + 2x^2 * y - 7x + 2
 2.0 - 7.0 x + 1.0 x³ + 2.0 x² y + 𝒪(‖x‖⁵)

julia> ∇(p)
2-element Array{TaylorN{Float64},1}:
  - 7.0 + 3.0 x² + 4.0 x y + 𝒪(‖x‖⁵)
                    2.0 x² + 𝒪(‖x‖⁵)

julia> integrate(p, 1)
 2.0 x - 3.5 x² + 0.25 x⁴ + 0.6666666666666666 x³ y + 𝒪(‖x‖⁵)

julia> integrate(p, 2)
 2.0 y - 7.0 x y + 1.0 x³ y + 1.0 x² y² + 𝒪(‖x‖⁵)
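
A multivariate series can likewise be evaluated at a concrete point with evaluate. A minimal sketch (printed output elided); since p is a polynomial of total degree 4, the evaluation here is exact:

julia> evaluate(p, [1.0, 2.0])   # p(1, 2) = 1 + 4 - 7 + 2 = 0.0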

For more details, please see the docs.

License

TaylorSeries is licensed under the MIT "Expat" license.

Installation

TaylorSeries is a registered Julia package and can be installed with using Pkg; Pkg.add("TaylorSeries").
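
Equivalently, from the Pkg REPL mode, entered by pressing ] at the julia> prompt:

pkg> add TaylorSeries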

Contributing

There are many ways to contribute to this package:

  • Report an issue if you encounter some odd behavior, or if you have suggestions to improve the package.
  • Contribute code that addresses open issues, adds new functionality, or improves performance.
  • When contributing code, add docstrings and comments so that others can understand the methods implemented.
  • Contribute by updating and improving the documentation.

References

  • W. Tucker, Validated numerics: A short introduction to rigorous computations, Princeton University Press (2011).
  • A. Haro, Automatic differentiation methods in computational dynamical systems: Invariant manifolds and normal forms of vector fields at fixed points, preprint.

Acknowledgments

This project began (using Python) during a Master's course in the postgraduate programs in Physics and Mathematics at UNAM, during the second half of 2013. We thank the participants of the course for putting up with the half-baked material and for contributing energy and ideas.

We acknowledge financial support from DGAPA-UNAM PAPIME grants PE-105911 and PE-107114, and DGAPA-PAPIIT grants IG-101113 and IG-100616. LB acknowledges support through a Cátedra Moshinsky (2013).
