Differentiable Programming Tensor Networks


Abstract

Differentiable programming is an emerging programming paradigm that composes parameterized algorithmic components and optimizes them by gradient search. The concept emerged from deep learning but is not limited to training neural networks. We present the theory and practice of programming tensor network algorithms in a fully differentiable way. By formulating the tensor network algorithm as a computation graph, one can compute higher-order derivatives of the program accurately and efficiently using automatic differentiation. We present essential techniques for differentiating through tensor network contraction algorithms, including numerically stable differentiation for tensor decompositions and efficient backpropagation through fixed-point iterations. As a demonstration, we compute the specific heat of the Ising model directly by taking the second-order derivative of the free energy obtained in the tensor renormalization group calculation. Next, we perform gradient-based variational optimization of infinite projected entangled pair states for the quantum antiferromagnetic Heisenberg model and obtain state-of-the-art variational energy and magnetization with moderate effort. Differentiable programming removes the laborious human effort of deriving and implementing analytical gradients for tensor network programs, which opens the door to more innovations in tensor network algorithms and applications.
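To illustrate the idea of the Ising benchmark described above, the following minimal sketch (not the authors' implementation) shows how a specific heat follows from differentiating a free-energy program twice with automatic differentiation. It uses JAX, and the hypothetical free_energy function substitutes the exactly solvable one-dimensional Ising chain for the tensor renormalization group contraction of the two-dimensional model.

import jax
import jax.numpy as jnp

def free_energy(temperature):
    # Exact free energy per site of the 1D Ising chain (J = k_B = 1),
    # standing in here for a TRG contraction of the 2D partition function.
    return -temperature * jnp.log(2.0 * jnp.cosh(1.0 / temperature))

def specific_heat(temperature):
    # c(T) = -T * d^2 f / dT^2, obtained by applying automatic
    # differentiation to the free-energy program twice.
    return -temperature * jax.grad(jax.grad(free_energy))(temperature)

print(specific_heat(1.0))  # ~0.42, matching the analytic (1/T)^2 / cosh(1/T)^2

In the setting of the paper, free_energy would instead run the full tensor network contraction, and the same double differentiation yields the specific heat reported for the Ising model.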

Cite

Liao, H. J., Liu, J. G., Wang, L., & Xiang, T. (2019). Differentiable Programming Tensor Networks. Physical Review X, 9(3), 031041. https://doi.org/10.1103/PhysRevX.9.031041
