under-Peter / OMEinsum.jl

License: MIT
One More Einsum for Julia! With runtime order-specification and high-level adjoints for AD

Programming Languages

julia

Projects that are alternatives to or similar to OMEinsum.jl

Gorgonia
Gorgonia is a library that helps facilitate machine learning in Go.
Stars: ✭ 4,295 (+5865.28%)
Mutual labels:  automatic-differentiation
Deeplearning.scala
A simple library for creating complex neural networks
Stars: ✭ 745 (+934.72%)
Mutual labels:  automatic-differentiation
Jax Fenics Adjoint
Differentiable interface to FEniCS for JAX using dolfin-adjoint/pyadjoint
Stars: ✭ 32 (-55.56%)
Mutual labels:  automatic-differentiation
Pinocchio
A fast and flexible implementation of Rigid Body Dynamics algorithms and their analytical derivatives
Stars: ✭ 432 (+500%)
Mutual labels:  automatic-differentiation
Control Toolbox
The Control Toolbox - An Open-Source C++ Library for Robotics, Optimal and Model Predictive Control
Stars: ✭ 562 (+680.56%)
Mutual labels:  automatic-differentiation
Pennylane
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
Stars: ✭ 800 (+1011.11%)
Mutual labels:  automatic-differentiation
Deep Learning From Scratch 3
"Deep Learning from Scratch ❸" (『ゼロから作る Deep Learning ❸』, O'Reilly Japan, 2020)
Stars: ✭ 380 (+427.78%)
Mutual labels:  automatic-differentiation
Quantumflow Dev
QuantumFlow: A Quantum Algorithms Development Toolkit
Stars: ✭ 43 (-40.28%)
Mutual labels:  automatic-differentiation
Autodiff
Automatic differentiation made easier for C++
Stars: ✭ 641 (+790.28%)
Mutual labels:  automatic-differentiation
Spago
Self-contained Machine Learning and Natural Language Processing library in Go
Stars: ✭ 854 (+1086.11%)
Mutual labels:  automatic-differentiation
Deepdarkfantasy
A Programming Language for Deep Learning
Stars: ✭ 463 (+543.06%)
Mutual labels:  automatic-differentiation
Math
The Stan Math Library is a C++ template library for automatic differentiation of any order using forward, reverse, and mixed modes. It includes a range of built-in functions for probabilistic modeling, linear algebra, and equation solving.
Stars: ✭ 494 (+586.11%)
Mutual labels:  automatic-differentiation
Owl
Owl - OCaml Scientific and Engineering Computing @ http://ocaml.xyz
Stars: ✭ 919 (+1176.39%)
Mutual labels:  automatic-differentiation
Enzyme
High-performance automatic differentiation of LLVM.
Stars: ✭ 418 (+480.56%)
Mutual labels:  automatic-differentiation
Autoppl
C++ template library for probabilistic programming
Stars: ✭ 34 (-52.78%)
Mutual labels:  automatic-differentiation
Kotlingrad
Shape-Safe Symbolic Differentiation with Algebraic Data Types
Stars: ✭ 388 (+438.89%)
Mutual labels:  automatic-differentiation
Arraymancer
A fast, ergonomic and portable tensor library in Nim with a deep learning focus for CPU, GPU and embedded devices via OpenMP, Cuda and OpenCL backends
Stars: ✭ 793 (+1001.39%)
Mutual labels:  automatic-differentiation
Tensornetworkad.jl
Algorithms that combine tensor network methods with automatic differentiation
Stars: ✭ 54 (-25%)
Mutual labels:  automatic-differentiation
Qualia2.0
Qualia is a deep learning framework deeply integrated with automatic differentiation and dynamic graphing with CUDA acceleration. Qualia was built from scratch.
Stars: ✭ 41 (-43.06%)
Mutual labels:  automatic-differentiation
Zygote.jl
Intimate Affection Auditor
Stars: ✭ 933 (+1195.83%)
Mutual labels:  automatic-differentiation
OMEinsum logo

OMEinsum - One More Einsum


This is a repository for the Google Summer of Code project on Differentiable Tensor Networks. It implements one function that both computer scientists and physicists love: the Einstein summation.

(figure: einsum definition)
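
In words: every index that appears in the inputs but not in the output is summed over. For example, the code `ij,jk->ik` denotes

$$y_{ik} = \sum_{j} a_{ij}\, b_{jk},$$

which is just matrix multiplication.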

To find out the details about einsum, please check out my nextjournal-article or the numpy-manual.

Einstein summation can be implemented in no more than 20 lines of Julia code, and its automatic differentiation is also straightforward. The main effort of this package goes into improving performance by utilizing Julia's multiple dispatch on traits, so that users can enjoy the speed of faster specialized implementations such as BLAS functions, `sum`, and `permutedims` on both CPU and GPU without suffering runtime overhead.
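
As a sketch of the first claim, here is a naive einsum in plain Julia. This is purely illustrative and not OMEinsum's actual implementation; the name `naive_einsum` and its argument convention are invented for this example:

# Naive einsum: `ixs` are the input index strings, `iy` the output index
# string, `xs` the input tensors. Every label not in `iy` is summed over.
function naive_einsum(ixs, iy, xs...)
    sizes = Dict{Char,Int}()                       # label -> dimension size
    for (ix, x) in zip(ixs, xs), (l, s) in zip(ix, size(x))
        sizes[l] = s
    end
    labels = unique!(vcat(collect.(ixs)..., collect(iy)))
    out = zeros(eltype(first(xs)), Tuple(sizes[l] for l in iy)...)
    for vals in Iterators.product((1:sizes[l] for l in labels)...)
        env = Dict(zip(labels, vals))              # current value of each label
        term = prod(x[Tuple(env[l] for l in ix)...] for (ix, x) in zip(ixs, xs))
        out[Tuple(env[l] for l in iy)...] += term
    end
    return out
end

naive_einsum(("ik", "kj"), "ij", [1 2; 3 4], [5 6; 7 8])  # == [19 22; 43 50]

Such a loop visits the full index space, which is exactly the runtime cost the trait-based dispatch in this package is designed to avoid.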

Note on why the test coverage is not 100%: GPU code coverage is not evaluated, although the GPU code is properly tested on GitLab. Ignoring the GPU code, the actual coverage is about 97%.

Install

To install, type `]` in a Julia (>= 1.5) REPL to enter the package mode and then input

pkg> add OMEinsum

Learn by Examples

To avoid runtime overhead, we recommend using the non-standard string literal `@ein_str`. The following examples illustrate how einsum works:

julia> using OMEinsum, SymEngine

julia> catty = fill(Basic(:🐱), 2, 2)
2×2 Array{Basic,2}:
 🐱  🐱
 🐱  🐱

julia> fish = fill(Basic(:🐟), 2, 3, 2)
2×3×2 Array{Basic,3}:
[:, :, 1] =
 🐟  🐟  🐟
 🐟  🐟  🐟

[:, :, 2] =
 🐟  🐟  🐟
 🐟  🐟  🐟

julia> snake = fill(Basic(:🐍), 3, 3)
3×3 Array{Basic,2}:
 🐍  🐍  🐍
 🐍  🐍  🐍
 🐍  🐍  🐍

julia> medicine = ein"ij,jki,kk->k"(catty, fish, snake)
3-element Array{Basic,1}:
 4*🐱*🐍*🐟
 4*🐱*🐍*🐟
 4*🐱*🐍*🐟

julia> ein"ik,kj -> ij"(catty, catty) # multiply two matrices `a` and `b`
2×2 Array{Basic,2}:
 2*🐱^2  2*🐱^2
 2*🐱^2  2*🐱^2

julia> ein"ij -> "(catty)[] # sum a matrix, output 0-dimensional array
4*🐱

julia> ein"->ii"(asarray(snake[1,1]), size_info=IndexSize('i'=>5)) # get 5 x 5 identity matrix
5×5 Array{Basic,2}:
 🐍  0  0  0  0
 0  🐍  0  0  0
 0  0  🐍  0  0
 0  0  0  🐍  0
 0  0  0  0  🐍

Alternatively, one can specify the contraction with a constructor-based approach, which is useful when the contraction code is only known at run time:

julia> einsum(EinCode((('i','k'),('k','j')),('i','j')), (a, b))  # for matrices a and b

or with a macro-based interface, the `@ein` macro, which is closer to the standard way of writing einsum operations in physics:

julia> @ein c[i,j] := a[i,k] * b[k,j];

A table for reference:

| code | meaning |
| --- | --- |
| `ein"ij,jk->ik"` | matrix-matrix multiplication |
| `ein"ijl,jkl->ikl"` | batched matrix-matrix multiplication |
| `ein"ij,j->i"` | matrix-vector multiplication |
| `ein"ij,ik,il->jkl"` | star contraction |
| `ein"ii->"` | trace |
| `ein"ij->i"` | sum over the second index |
| `ein"ii->i"` | take the diagonal of a matrix |
| `ein"ijkl->ilkj"` | permute the dimensions of a tensor |
| `ein"i->ii"` | construct a diagonal matrix |
| `ein"->ii"` | broadcast a scalar to the diagonal of a matrix |
| `ein"ij,ij->ij"` | element-wise product |
| `ein"ij,kl->ijkl"` | outer product |

Many of these are handled by special kernels (listed in the docs), but there is also a fallback which handles other cases (more like what Einsum.jl does, plus a GPU version).

It is sometimes helpful to specify the order of operations by inserting brackets, either because you know this will be more efficient or to help the computer see which kernels can be used. For example:

julia> @ein Z[o,s] := x[i,s] * (W[o,i,j] * y[j,s]);   # macro style

julia> Z = ein"is, (oij, js) -> os"(x, W, y);         # string style

This performs matrix multiplication (summing over j) followed by batched matrix multiplication (summing over i, with batch label s). Without the brackets, it instead uses the fallback `loop_einsum`, which is slower. Calling `allow_loops(false)` makes it print an error to help you spot such cases:

julia> @ein Zl[o,s] := x[i,s] * W[o,i,j] * y[j,s];

julia> Z ≈ Zl
true

julia> allow_loops(false);

julia> Zl = ein"is, oij, js -> os"(x, W, y);
┌ Error: using `loop_einsum` to evaluate
│   code = EinCode{((1, 2), (3, 1, 4), (4, 2)),(3, 2)}()
│   size.(xs) = ((10, 50), (20, 10, 10), (10, 50))
│   size(y) = (20, 50)
└ @ OMEinsum ~/.julia/dev/OMEinsum/src/loop_einsum.jl:26

To see more examples using the GPU and autodiff, check out our asciinema demo (asciicast).

Application

For an application in tensor network algorithms, check out the TensorNetworkAD package, where OMEinsum is used to evaluate tensor contractions, permutations, and summations.

Toy Application: solving a 3-coloring problem on the Petersen graph

Let us focus on graphs in which every vertex has exactly three edges. A question one might ask is: how many different ways are there to colour the edges of the graph with three colours such that no vertex has a duplicate colour on its edges?

The counting problem can be transformed into a contraction of rank-3 tensors, one per vertex, whose indices represent that vertex's edges. Consider the tensor `s` defined as

julia> s = map(x->Int(length(unique(x.I)) == 3), CartesianIndices((3,3,3)))
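
Each entry of `s` is 1 exactly when the three indices are pairwise distinct, so `s` has 3! = 6 nonzero entries:

julia> sum(s)
6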

Then we can simply contract `s` tensors to get the number of 3-colourings satisfying the above condition! E.g. for two vertices, we get 6 distinct colourings:

julia> ein"ijk,ijk->"(s,s)[]
6

Using that method, it is easy to find that, e.g., the Petersen graph admits no 3-colouring, since

julia> ein"afl,bhn,cjf,dlh,enj,ago,big,cki,dmk,eom->"(fill(s, 10)...)[]
0

The Petersen graph consists of 10 vertices and 15 edges and looks like a pentagram embedded in a pentagon, as depicted below:

(figure: the Petersen graph)

Confronted with the above result, we can ask whether the Petersen graph allows a relaxed variant of 3-colouring, in which one vertex may accept duplicate colours. The answer can be found using the gradient with respect to a vertex tensor:

julia> using Zygote: gradient

julia> gradient(x->ein"afl,bhn,cjf,dlh,enj,ago,big,cki,dmk,eom->"(x,s,s,s,s,s,s,s,s,s)[], s)[1] |> sum
0

This tells us that even if we allow duplicate colours on one vertex, there are no 3-colourings of the Petersen graph.

Contribute

Suggestions and comments in the issues are welcome.

License

MIT License
