under-Peter / TensorNetworkAD.jl

License: MIT
Algorithms that combine tensor network methods with automatic differentiation

Programming language: Julia
Projects that are alternatives to, or similar to, TensorNetworkAD.jl

Kotlingrad
Shape-Safe Symbolic Differentiation with Algebraic Data Types
Stars: ✭ 388 (+618.52%)
Mutual labels:  automatic-differentiation
Autodiff
automatic differentiation made easier for C++
Stars: ✭ 641 (+1087.04%)
Mutual labels:  automatic-differentiation
Spago
Self-contained Machine Learning and Natural Language Processing library in Go
Stars: ✭ 854 (+1481.48%)
Mutual labels:  automatic-differentiation
Enzyme
High-performance automatic differentiation of LLVM.
Stars: ✭ 418 (+674.07%)
Mutual labels:  automatic-differentiation
Math
The Stan Math Library is a C++ template library for automatic differentiation of any order using forward, reverse, and mixed modes. It includes a range of built-in functions for probabilistic modeling, linear algebra, and equation solving.
Stars: ✭ 494 (+814.81%)
Mutual labels:  automatic-differentiation
Arraymancer
A fast, ergonomic and portable tensor library in Nim with a deep learning focus for CPU, GPU and embedded devices via OpenMP, Cuda and OpenCL backends
Stars: ✭ 793 (+1368.52%)
Mutual labels:  automatic-differentiation
Theano lstm
🔬 Nano size Theano LSTM module
Stars: ✭ 310 (+474.07%)
Mutual labels:  automatic-differentiation
Qualia2.0
Qualia is a deep learning framework deeply integrated with automatic differentiation and dynamic graphing with CUDA acceleration. Qualia was built from scratch.
Stars: ✭ 41 (-24.07%)
Mutual labels:  automatic-differentiation
Control Toolbox
The Control Toolbox - An Open-Source C++ Library for Robotics, Optimal and Model Predictive Control
Stars: ✭ 562 (+940.74%)
Mutual labels:  automatic-differentiation
Zygote.jl
Intimate Affection Auditor
Stars: ✭ 933 (+1627.78%)
Mutual labels:  automatic-differentiation
Pinocchio
A fast and flexible implementation of Rigid Body Dynamics algorithms and their analytical derivatives
Stars: ✭ 432 (+700%)
Mutual labels:  automatic-differentiation
Forwarddiff.jl
Forward Mode Automatic Differentiation for Julia
Stars: ✭ 466 (+762.96%)
Mutual labels:  automatic-differentiation
Pennylane
PennyLane is a cross-platform Python library for differentiable programming of quantum computers. Train a quantum computer the same way as a neural network.
Stars: ✭ 800 (+1381.48%)
Mutual labels:  automatic-differentiation
Gorgonia
Gorgonia is a library that helps facilitate machine learning in Go.
Stars: ✭ 4,295 (+7853.7%)
Mutual labels:  automatic-differentiation
Jax Fenics Adjoint
Differentiable interface to FEniCS for JAX using dolfin-adjoint/pyadjoint
Stars: ✭ 32 (-40.74%)
Mutual labels:  automatic-differentiation
Deep Learning From Scratch 3
『ゼロから作る Deep Learning ❸』(O'Reilly Japan, 2020)
Stars: ✭ 380 (+603.7%)
Mutual labels:  automatic-differentiation
Deeplearning.scala
A simple library for creating complex neural networks
Stars: ✭ 745 (+1279.63%)
Mutual labels:  automatic-differentiation
Quantumflow Dev
QuantumFlow: A Quantum Algorithms Development Toolkit
Stars: ✭ 43 (-20.37%)
Mutual labels:  automatic-differentiation
Autoppl
C++ template library for probabilistic programming
Stars: ✭ 34 (-37.04%)
Mutual labels:  automatic-differentiation
Owl
Owl - OCaml Scientific and Engineering Computing @ http://ocaml.xyz
Stars: ✭ 919 (+1601.85%)
Mutual labels:  automatic-differentiation
TensorNetworkAD

This is a repository for the Google Summer of Code project on Differentiable Tensor Networks.

In this package we implement the algorithms described in Differentiable Programming Tensor Networks: automatic differentiation (AD) applied to the Corner Transfer Matrix Renormalization Group (CTMRG) and the Tensor Renormalization Group (TRG), demonstrating two applications:

  • Gradient-based optimization of iPEPS
  • Direct calculation of energy densities in iPEPS via derivatives of the free energy

More generally we aim to provide Julia with the tools to combine AD and tensor network methods.

Suggestions and comments in the Issues are welcome.

Example

Since this package was inspired by the Differentiable Programming Tensor Networks paper, we demonstrate its algorithms below.

Free Energy of the 2D Classical Ising Model

We start by constructing the tensor for the tensor-network representation of the 2D classical Ising model. This tensor is built by the model_tensor function, which takes a model parameter (in our case Ising()) and an inverse temperature β (e.g. β = 0.5).

julia> a = model_tensor(Ising(), 0.5)
2×2×2×2 Array{Float64,4}:
[:, :, 1, 1] =
 2.53434  0.5    
 0.5      0.18394

[:, :, 2, 1] =
 0.5      0.18394
 0.18394  0.5    

[:, :, 1, 2] =
 0.5      0.18394
 0.18394  0.5    

[:, :, 2, 2] =
 0.18394  0.5    
 0.5      2.53434
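For readers without a Julia session at hand, the same tensor can be sketched in plain NumPy. The construction below assumes the standard recipe: split each Boltzmann bond weight symmetrically and absorb one square-root factor into each of the four legs of a site. This convention is an assumption about what model_tensor does internally, but the resulting entries match the output above.

```python
import numpy as np

def ising_tensor(beta):
    """Rank-4 site tensor of the 2D classical Ising partition function."""
    # Boltzmann weight of one bond: M[s, s'] = exp(beta * s * s'), spins s = +1, -1
    M = np.exp(beta * np.array([[1.0, -1.0], [-1.0, 1.0]]))
    # Symmetric square root Q with M = Q @ Q.T, one factor per leg
    w, V = np.linalg.eigh(M)
    Q = V @ np.diag(np.sqrt(w)) @ V.T
    # Sum over the site spin s with one Q on each of the four legs
    return np.einsum('si,sj,sk,sl->ijkl', Q, Q, Q, Q)

a = ising_tensor(0.5)
print(a[0, 0, 0, 0], a[0, 1, 1, 0])  # ≈ 2.53434 and 0.18394, as in the Julia output
```

Because every leg carries the same factor Q, this tensor is symmetric under any permutation of its indices, which is why the four displayed slices contain only the three distinct values 2.53434, 0.5, and 0.18394.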

Using the trg function, we can calculate the partition function of the model per site:

julia> trg(a, 20,20)
1.0257933734351765

This call grows the bond dimension up to 20 and performs 20 coarse-graining iterations, i.e. it grows the system to 2^20 sites, which is well converged for our purposes.
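The TRG number can be cross-checked without any tensor networks at all: the 2D classical Ising model is exactly solvable, and Onsager's closed-form expression for ln Z per site can be evaluated by numerical quadrature. A NumPy sketch (the trapezoidal rule on a periodic, smooth integrand converges spectrally, so a modest grid suffices):

```python
import numpy as np

def onsager_lnz_per_site(beta, n=512):
    """Onsager's exact ln(Z)/N for the square-lattice Ising model:
    ln 2 + (1/8pi^2) * integral over [0, 2pi)^2 of
    ln[cosh^2(2b) - sinh(2b) * (cos t1 + cos t2)]."""
    theta = 2 * np.pi * (np.arange(n) + 0.5) / n   # midpoint grid over one period
    t1, t2 = np.meshgrid(theta, theta)
    c, s = np.cosh(2 * beta), np.sinh(2 * beta)
    return np.log(2) + np.log(c**2 - s * (np.cos(t1) + np.cos(t2))).mean() / 2

print(onsager_lnz_per_site(0.5))  # ≈ 1.02579, matching trg(a, 20, 20)
```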

Given ln Z per site, we obtain the energy density as its first derivative with respect to β times -1. With Zygote, this is straightforward to calculate:

julia> using Zygote: gradient

julia> dlnZ = gradient(β -> trg(model_tensor(Ising(), β), 20, 20), 0.5)[1]
1.7455677143228514

julia> -dlnZ
-1.7455677143228514

which agrees with the data presented in the paper.
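The exact solution also gives an AD-free sanity check on this gradient: a central finite difference of Onsager's ln Z per site with respect to β should reproduce the number Zygote computes through trg. A NumPy sketch:

```python
import numpy as np

def onsager_lnz_per_site(beta, n=512):
    """Onsager's exact ln(Z)/N for the square-lattice Ising model."""
    theta = 2 * np.pi * (np.arange(n) + 0.5) / n
    t1, t2 = np.meshgrid(theta, theta)
    c, s = np.cosh(2 * beta), np.sinh(2 * beta)
    return np.log(2) + np.log(c**2 - s * (np.cos(t1) + np.cos(t2))).mean() / 2

# Central finite difference in beta at beta = 0.5
h = 1e-5
dlnz = (onsager_lnz_per_site(0.5 + h) - onsager_lnz_per_site(0.5 - h)) / (2 * h)
print(dlnz, -dlnz)  # ≈ 1.74557 and -1.74557, matching the Zygote result
```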

Finding the Ground State of the Infinite 2D Heisenberg Model

The other algorithm variationally minimizes the energy of a Heisenberg model on a two-dimensional infinite lattice using a form of gradient descent.

First, we need the Hamiltonian as a tensor-network operator:

julia> h = hamiltonian(Heisenberg())
2×2×2×2 Array{Float64,4}:
[:, :, 1, 1] =
 -0.5  0.0
  0.0  0.5

[:, :, 2, 1] =
  0.0  0.0
 -1.0  0.0

[:, :, 1, 2] =
 0.0  -1.0
 0.0   0.0

[:, :, 2, 2] =
 0.5   0.0
 0.0  -0.5

which is the Heisenberg Hamiltonian with default parameters Jx = Jy = Jz = 1.0. Next, we initialize an iPEPS tensor and calculate the energy of that tensor with respect to the Hamiltonian:

julia> ipeps = SquareIPEPS(rand(2,2,2,2,2));

julia> ipeps = TensorNetworkAD.indexperm_symmetrize(ipeps);

julia> TensorNetworkAD.energy(h,ipeps, χ=20, tol=1e-6,maxit=100)
-0.5278485155836766

where the initial energy is random because the iPEPS tensor was initialized randomly.
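For orientation, here is the same two-site coupling written in textbook form, H = Jx·SxSx + Jy·SySy + Jz·SzSz with Jx = Jy = Jz = 1, as a NumPy sketch. The package's hamiltonian evidently applies its own sign and index conventions (its raw entries differ from the textbook matrix), but the essential physics is the singlet-below-triplet spectrum:

```python
import numpy as np

# Spin-1/2 operators (hbar = 1)
sx = np.array([[0, 0.5], [0.5, 0]])
sy = np.array([[0, -0.5j], [0.5j, 0]])
sz = np.array([[0.5, 0], [0, -0.5]])

# Textbook two-site Heisenberg coupling with Jx = Jy = Jz = 1
H = sum(np.kron(s, s) for s in (sx, sy, sz)).real  # imaginary parts cancel
print(np.linalg.eigvalsh(H))  # [-0.75, 0.25, 0.25, 0.25]: singlet ground state
```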

To minimise this energy, we provide the optimiseipeps function, which combines Optim and Zygote under the hood.

julia> using Optim

julia> res = optimiseipeps(ipeps, h; χ=20, tol=1e-6, maxit=100,
        optimargs = (Optim.Options(f_tol=1e-6, show_trace=true),));
Iter     Function value   Gradient norm
     0    -5.015158e-01     2.563357e-02
 * time: 4.100799560546875e-5
     1    -6.171409e-01     3.170732e-02
 * time: 0.3943500518798828
     2    -6.558814e-01     2.927539e-02
 * time: 0.6722378730773926
     3    -6.577320e-01     1.299056e-02
 * time: 1.0529990196228027
     4    -6.587514e-01     8.515789e-03
 * time: 1.2889769077301025
     5    -6.595896e-01     1.102446e-02
 * time: 1.5059330463409424
     6    -6.599735e-01     2.020418e-03
 * time: 1.8917429447174072
     7    -6.600449e-01     4.343536e-03
 * time: 2.180701971054077
     8    -6.601202e-01     2.623793e-03
 * time: 2.5907390117645264
     9    -6.602188e-01     3.951503e-04
 * time: 2.9895379543304443
    10    -6.602232e-01     2.597750e-04
 * time: 3.254667043685913
    11    -6.602246e-01     2.960359e-04
 * time: 3.4899749755859375
    12    -6.602282e-01     2.846450e-04
 * time: 3.739893913269043
    13    -6.602290e-01     1.679273e-04
 * time: 3.9142658710479736
    14    -6.602303e-01     2.155790e-04
 * time: 4.230381011962891
    15    -6.602311e-01     2.239934e-05
 * time: 4.5699989795684814
    16    -6.602311e-01     1.935087e-05
 * time: 4.837096929550171

where our final value for the energy, e = -0.6602, agrees with the value found in the paper.
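The optimization pattern itself can be illustrated in miniature without CTMRG: gradient descent on the Rayleigh quotient of the two-site textbook Heisenberg coupling (a stand-in for the iPEPS energy functional, not the package's actual code path) drives a random vector to the singlet at E = -0.75, just as optimiseipeps drives the random iPEPS toward the ground state:

```python
import numpy as np

# Two-site spin-1/2 Heisenberg coupling in the textbook convention
sx = np.array([[0, 0.5], [0.5, 0]])
sy = np.array([[0, -0.5j], [0.5j, 0]])
sz = np.array([[0.5, 0], [0, -0.5]])
H = sum(np.kron(s, s) for s in (sx, sy, sz)).real

def energy(v):
    """Rayleigh quotient: the variational energy of the (unnormalized) state v."""
    return v @ H @ v / (v @ v)

rng = np.random.default_rng(0)
v = rng.standard_normal(4)          # random initial state, like the random iPEPS
for _ in range(500):
    e = energy(v)
    grad = 2 * (H @ v - e * v) / (v @ v)  # gradient of the Rayleigh quotient
    v -= 0.2 * grad                        # plain gradient descent
print(energy(v))  # ≈ -0.75, the singlet ground-state energy
```

In the package, Zygote supplies the gradient of the CTMRG-based energy instead of this closed-form expression, and Optim replaces the hand-rolled descent loop.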

License

MIT License
