On the descriptive power of Neural Networks as constrained Tensor Networks with exponentially large bond dimension

Abstract

In many cases, neural networks can be mapped onto tensor networks with an exponentially large bond dimension. Here, we compare different sub-classes of neural network states with their mapped tensor network counterparts for studying the ground state of short-range Hamiltonians. We show that the tensor network resulting from such a mapping is highly constrained, so that neural network states do not, in general, deliver the naively expected drastic improvement over state-of-the-art tensor network methods. We demonstrate this result explicitly in two paradigmatic examples, the 1D ferromagnetic Ising model and the 2D antiferromagnetic Heisenberg model, addressing the lack of a detailed comparison of the expressiveness of these increasingly popular variational ansätze.
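To make the mapping concrete, the sketch below (a toy illustration, not the paper's code; all sizes and parameter names are chosen for the example) takes a restricted Boltzmann machine state with M hidden units and reproduces its amplitudes with a matrix-product-state-like contraction whose bond index runs over all 2^M hidden configurations, i.e. an exponentially large bond dimension. The constraint the abstract refers to is visible directly: every bulk tensor in this construction is diagonal in the bond index.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
N, M = 4, 3                      # visible spins, hidden units (toy sizes)
a = rng.normal(size=N)           # visible biases
b = rng.normal(size=M)           # hidden biases
W = rng.normal(size=(N, M))      # visible-hidden couplings

def rbm_amplitude(s):
    """psi(s) = exp(a.s) * prod_j 2 cosh(b_j + sum_i W_ij s_i)."""
    theta = b + s @ W
    return np.exp(a @ s) * np.prod(2 * np.cosh(theta))

# All 2^M hidden configurations; this set plays the role of the bond index.
H = np.array(list(itertools.product([-1, 1], repeat=M)))  # shape (2^M, M)

def mps_amplitude(s):
    # First site carries the hidden biases; each site i contributes its row of W.
    v = np.exp(a[0] * s[0] + H @ b + s[0] * (H @ W[0]))   # boundary vector, dim 2^M
    for i in range(1, N):
        # Bulk tensors are diagonal in the bond index: an elementwise product.
        v = v * np.exp(a[i] * s[i] + s[i] * (H @ W[i]))
    return v.sum()

for s in itertools.product([-1, 1], repeat=N):
    s = np.array(s)
    assert np.isclose(rbm_amplitude(s), mps_amplitude(s))
```

Summing the Boltzmann weight over each hidden spin h_j in {-1, 1} yields the 2 cosh factor, which is why the two routes agree; the diagonal structure of the bulk tensors is what makes the mapped tensor network far more constrained than a generic one of the same bond dimension.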

Citation (APA)
Collura, M., Dell’Anna, L., Felser, T., & Montangero, S. (2021). On the descriptive power of Neural Networks as constrained Tensor Networks with exponentially large bond dimension. SciPost Physics Core, 4(1). https://doi.org/10.21468/SciPostPhysCore.4.1.001
