Sign problem in tensor network contraction
ORAL
Abstract
We investigate how the computational difficulty of contracting tensor networks depends on the sign structure of the tensor entries. We pursue this question by studying random tensor networks with varying bias towards positive entries. First, we consider contraction via Monte Carlo sampling, and find that the transition from hard to easy occurs when the entries become predominantly positive; this can be seen as a tensor network manifestation of the Quantum Monte Carlo sign problem. Second, we analyze the commonly used contraction scheme based on boundary tensor networks. Its performance is governed by the amount of correlations (entanglement) in the tensor network. We find that the transition from a volume-law to a boundary-law scaling of entanglement occurs already for a slight bias towards a positive mean, and it occurs earlier the larger the bond dimension is. This is in contrast both to expectations and to the behavior found in Monte Carlo contraction. We gain further insight into this early transition from the study of an effective statistical-mechanics model. Finally, we investigate the computational difficulty of computing expectation values of PEPS, where we find that the complexity of entanglement-based contraction always remains low. We explain this with a local transformation which maps PEPS expectation values to a positive-valued tensor network. This also suggests new approaches towards PEPS contraction based on positive decompositions.
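To illustrate the setup (this is a toy sketch, not the authors' code), one can draw random tensors whose entries are a tunable positive mean plus Gaussian noise and contract a small network exactly. Here a 1D ring of random matrices stands in for a tensor network; the hypothetical `bias` parameter plays the role of the bias towards positive entries: at zero bias the contraction value fluctuates in sign across samples, while a strong positive bias makes it predominantly positive.

```python
import numpy as np

def ring_contraction(n_sites=8, D=4, bias=0.0, seed=0):
    """Contract a 1D ring of random D x D tensors (trace of a matrix product).

    Entries are drawn as bias + N(0, 1); 'bias' tunes the fraction of
    positive entries, mimicking the positive-bias parameter of the
    random tensor networks described in the abstract (toy model only).
    """
    rng = np.random.default_rng(seed)
    mats = [bias + rng.standard_normal((D, D)) for _ in range(n_sites)]
    result = np.eye(D)
    for m in mats:
        result = result @ m
    return np.trace(result)

# Count how often the contraction value is positive over many samples.
vals_unbiased = [ring_contraction(bias=0.0, seed=s) for s in range(100)]
vals_biased = [ring_contraction(bias=3.0, seed=s) for s in range(100)]
print(sum(v > 0 for v in vals_unbiased), sum(v > 0 for v in vals_biased))
```

With no bias the sign of the trace is symmetric around zero, so roughly half the samples are negative; with a strong bias the all-positive mean matrix dominates the product and nearly every sample is positive. The 2D analogue with biased tensors is the regime the abstract studies.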
–
Publication: https://arxiv.org/abs/2404.19023, https://arxiv.org/abs/2410.05414
Presenters
-
Jielun Chen
Caltech
Authors
-
Jielun Chen
Caltech
-
Jiaqing Jiang
Caltech
-
Dominik Hangleiter
University of California, Berkeley; University of Maryland, College Park
-
Norbert Schuch
University of Vienna