
mcabbott / TensorCast.jl

Licence: other
It slices, it dices, it splices!

Programming Languages

julia

Projects that are alternatives to, or similar to, TensorCast.jl

GenericTensor
The only library that allows creating tensors (an extension of matrices) with custom types
Stars: ✭ 42 (-60.38%)
Mutual labels:  tensor
scikit tt
Tensor Train Toolbox
Stars: ✭ 52 (-50.94%)
Mutual labels:  tensor
dewdle
A remote video-feed drawing tool (telestrator) for streaming and broadcast environments.
Stars: ✭ 29 (-72.64%)
Mutual labels:  broadcasting
DI-treetensor
Let DI-treetensor help you simplify structured processing! (Tree-structured computation easily becomes a logical mess? DI-treetensor sorts it out for you fast.)
Stars: ✭ 134 (+26.42%)
Mutual labels:  tensor
tensority
Strongly typed multidimensional array library for OCaml
Stars: ✭ 44 (-58.49%)
Mutual labels:  tensor
Tensorial.jl
Statically sized tensors and related operations for Julia
Stars: ✭ 18 (-83.02%)
Mutual labels:  tensor
Tensors.jl
Efficient computations with symmetric and non-symmetric tensors with support for automatic differentiation.
Stars: ✭ 142 (+33.96%)
Mutual labels:  tensor
mappool-generator
A Mappool Generator for osu! Tournament Livestreams
Stars: ✭ 20 (-81.13%)
Mutual labels:  broadcasting
deadsfu
Dead-simple WebRTC broadcasting. From the browser, or your application. Cloud-native and scalable.
Stars: ✭ 23 (-78.3%)
Mutual labels:  broadcasting
echo-server
Echo Server is a Docker-ready, multi-scalable Node.js application used to host your own Socket.IO server for Laravel Broadcasting.
Stars: ✭ 32 (-69.81%)
Mutual labels:  broadcasting
overscan
A live coding environment for live streaming video
Stars: ✭ 36 (-66.04%)
Mutual labels:  broadcasting
Tensor
A library and extension that provides objects for scientific computing in PHP.
Stars: ✭ 146 (+37.74%)
Mutual labels:  tensor
MatrixLib
Lightweight header-only matrix library (C++) for numerical optimization and machine learning. Contact me if there is an exciting opportunity.
Stars: ✭ 35 (-66.98%)
Mutual labels:  broadcasting
Xtensor.jl
Julia package for xtensor-julia
Stars: ✭ 38 (-64.15%)
Mutual labels:  tensor
NDScala
N-dimensional arrays in Scala 3. Think NumPy ndarray, but type-safe over shapes, array/axis labels & numeric data types
Stars: ✭ 37 (-65.09%)
Mutual labels:  tensor
pytorch-examples-cn
Learn PyTorch 1.0 by example (a Chinese translation and study of "Learning PyTorch with Examples")
Stars: ✭ 54 (-49.06%)
Mutual labels:  tensor
keras-neural-graph-fingerprint
Keras implementation of Neural Graph Fingerprints as proposed by Duvenaud et al., 2015
Stars: ✭ 47 (-55.66%)
Mutual labels:  tensor
Etaler
A flexible HTM (Hierarchical Temporal Memory) framework with full GPU support.
Stars: ✭ 79 (-25.47%)
Mutual labels:  tensor
pix
Interaction notation for UX design
Stars: ✭ 37 (-65.09%)
Mutual labels:  notation
soketi
Just another simple, fast, and resilient open-source WebSockets server. 📣
Stars: ✭ 2,202 (+1977.36%)
Mutual labels:  broadcasting

TensorCast.jl


This package lets you work with multi-dimensional arrays in index notation, by defining a few macros which translate this to broadcasting, permuting, and reducing operations.

The first is @cast, which deals both with "casting" into new shapes (including going to and from an array-of-arrays) and with broadcasting:

@cast A[row][col] := B[row, col]        # slice a matrix B into rows, also @cast A[r] := B[r,:]

@cast C[(i,j), (k,ℓ)] := D.x[i,j,k,ℓ]   # reshape a 4-tensor D.x to give a matrix

@cast E[φ,γ] = F[φ]^2 * exp(G[γ])       # broadcast E .= F.^2 .* exp.(G') into existing E

@cast _[i] := isodd(i) ? log(i) : V[i]  # broadcast a function of the index values

@cast T[x,y,n] := outer(M[:,n])[x,y]    # generalised mapslices, vector -> matrix function
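
For instance (a small sketch, with sizes invented for illustration), the first and third lines above amount to slicing and to an ordinary broadcast:

using TensorCast

B = rand(2, 3)
@cast A[row][col] := B[row, col]
A == [B[r, :] for r in 1:2]          # true: a vector of row slices

F, G = rand(3), rand(4)
@cast E[φ,γ] := F[φ]^2 * exp(G[γ])   # := allocates a new E here
E ≈ F.^2 .* exp.(G')                 # true: the broadcast from the comment above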

Second, @reduce takes sums (or other reductions) over the indicated directions. Among such sums is matrix multiplication, which can be done more efficiently using @matmul instead:

@reduce K[_,b] := prod(a,c) L.field[a,b,c]           # product over dims=(1,3), drop dims=3

@reduce S[i] = sum(n) -P[i,n] * log(P[i,n]/Q[n])     # sum!(S, @. -P*log(P/Q')) into existing S

@matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j]  # matrix multiplication, plus reshape
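
Spelled out with invented sizes (a sketch), these agree with the plain-Julia versions given in the comments:

using TensorCast

P, Q = rand(4, 5), rand(5)
@reduce S[i] := sum(n) -P[i,n] * log(P[i,n]/Q[n])
S ≈ vec(sum(-P .* log.(P ./ Q'), dims=2))    # true: broadcast, then sum over dims=2

U, V = rand(4, 2, 3), rand(6, 5)             # combined index (k,k′) runs over 2*3 = 6 values
@matmul M[i,j] := sum(k,k′) U[i,k,k′] * V[(k,k′),j]
M ≈ reshape(U, 4, 6) * V                     # true: reshape, then ordinary *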

The same notation with @cast applies a function accepting the dims keyword, without reducing:

@cast W[i,j,c,n] := cumsum(c) X[c,i,j,n]^2           # permute, broadcast, cumsum(; dims=3)

All of these are converted into array commands like reshape and permutedims and eachslice, plus a broadcasting expression if needed, and sum / sum!, or * / mul!. This package just provides a convenient notation.
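
For example (a sketch with invented sizes), the cumsum line above amounts to a permutedims plus a broadcast plus cumsum; the package's @pretty macro prints the code which any of these macros generates:

using TensorCast

X = rand(2, 3, 4, 5)
@cast W[i,j,c,n] := cumsum(c) X[c,i,j,n]^2
W ≈ cumsum(permutedims(X, (2,3,1,4)).^2, dims=3)   # true

@pretty @cast W[i,j,c,n] := cumsum(c) X[c,i,j,n]^2 # prints the generated code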

From version 0.4, it relies on TransmuteDims.jl to handle re-ordering of dimensions, and LazyStack.jl to handle slices. It should also now work with OffsetArrays.jl:

using OffsetArrays
@cast R[n,c] := n^2 + rand(3)[c]  (n in -5:5)        # arbitrary indexing
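
A small check of what to expect (continuing the example above):

using OffsetArrays, TensorCast

@cast R[n,c] := n^2 + rand(3)[c]  (n in -5:5)
size(R)            # (11, 3)
first(axes(R, 1))  # -5: rows are indexed by n ∈ -5:5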

And it can be used with some packages which modify broadcasting:

using Strided, LoopVectorization, LazyArrays
@cast @strided E[φ,γ] = F[φ]^2 * exp(G[γ])           # multi-threaded
@reduce @turbo S[i] := sum(n) -P[i,n] * log(P[i,n])  # SIMD-enhanced
@reduce @lazy M[i,j] := sum(k) U[i,k] * V[j,k]       # non-materialised

Installation

using Pkg; Pkg.add("TensorCast")

The current version requires Julia 1.6 or later. There are a few pages of documentation.

Elsewhere

Similar notation is also used by some other packages, although all of them use an implicit sum over repeated indices. TensorOperations.jl performs Einstein-convention contractions and traces:

@tensor A[i] := B[i,j] * C[j,k] * D[k]      # matrix multiplication, A = B * C * D
@tensor D[i] := 2 * E[i] + F[i,k,k]         # partial trace of F only, Dᵢ = 2Eᵢ + Σⱼ Fᵢⱼⱼ
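
For instance (sizes invented), the first line is exactly a chain of matrix products:

using TensorOperations

B, C, D = rand(2, 3), rand(3, 4), rand(4)
@tensor A[i] := B[i,j] * C[j,k] * D[k]
A ≈ B * (C * D)    # true: the repeated indices j and k are contracted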

More general contractions are allowed by OMEinsum.jl, but only one term:

@ein Z[i,j,ξ] := X[i,k,ξ] * Y[j,k,ξ]        # batched matrix multiplication
Z = ein" ikξ,jkξ -> ijξ "(X,Y)              # numpy-style notation
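
The two forms agree (a sketch, with invented sizes):

using OMEinsum

X, Y = rand(2, 3, 4), rand(5, 3, 4)
@ein Z[i,j,ξ] := X[i,k,ξ] * Y[j,k,ξ]
Z ≈ ein"ikξ,jkξ -> ijξ"(X, Y)      # true: one matrix product per slice ξ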

Einsum.jl and Tullio.jl allow arbitrary (element-wise) functions:

@einsum S[i] := -P[i,n] * log(P[i,n]/Q[n])  # sum over n, for each i (also with @reduce above)
@einsum G[i] := 2 * E[i] + F[i,k,k]         # the sum includes everything:  Gᵢ = Σⱼ (2Eᵢ + Fᵢⱼⱼ)
@tullio Z[i,j] := abs(A[i+x, j+y] * K[x,y]) # convolution, summing over x and y
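
As a quick check (invented sizes), both give the same reduction over n:

using Einsum, Tullio

P, Q = rand(4, 5), rand(5)
@einsum S1[i] := -P[i,n] * log(P[i,n]/Q[n])
@tullio S2[i] := -P[i,n] * log(P[i,n]/Q[n])
S1 ≈ S2 ≈ vec(sum(-P .* log.(P ./ Q'), dims=2))   # true: the sum covers the whole right side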

Notice that @einsum and @tullio sum the entire right hand side, like @reduce does, while @tensor sums individual terms.

These produce very different code for actually doing what you request: The macros @tensor and @ein work out a sequence of basic tensor operations (like contraction and traces), while @einsum and @tullio write the necessary set of nested loops directly (plus optimisations). This package's macros @cast, @reduce and @matmul instead write everything in terms of whole-array operations (like reshape, permutedims and broadcasting).

For those who speak Python, @cast and @reduce allow similar operations to einshape or einops (minus the cool video, but plus broadcasting) while @matmul and (from other packages) @ein, @tensor are closer to einsum.

About

This was a holiday project to learn a bit of metaprogramming, originally TensorSlice.jl. But it suffered a little scope creep.
