Quantum-inspired Tree Tensor Network Methods for Compression and Transformation of Multivariate Functions
ORAL
Abstract
Tensor networks are a powerful tool for compressing high-dimensional data by exploiting the low-rank structure inherent in many physical systems. When discretized, functions of real or complex variables can also be viewed as high-dimensional data, enabling a tensor-network-based "quantization." In a 1D layout this is known as the quantized tensor train (QTT) format. QTTs have been used to obtain compressed representations of high-order perturbative expansions in quantum many-body physics and to solve the incompressible Navier-Stokes equations in turbulent fluid dynamics. In this work, we explore the power of the more general family of Tree Tensor Networks (TTNs) as a compressed representation of multivariate functions. We present novel methods for constructing multivariate functions in TTN format via Fourier and Chebyshev interpolation and benchmark them against Tensor Cross Interpolation (TCI), an active-learning tensor network algorithm. Furthermore, we examine how different TTN tree layouts influence rank efficiency for specific functions, developing quantitative heuristics for layouts that yield the most compact representations.
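To make the quantization idea concrete, here is a minimal sketch (not the authors' implementation; the function names and truncation tolerance are illustrative assumptions) of how a 1D function becomes a QTT: sample it on 2^n uniform grid points, reshape the sample vector into an n-leg binary tensor, and factor that tensor into a tensor train with sequential truncated SVDs, all in plain NumPy.

import numpy as np

def quantize_to_qtt(f, n_bits, tol=1e-10):
    # Sample f on 2**n_bits uniform points in [0, 1) and view the sample
    # vector as a 2 x 2 x ... x 2 tensor, one leg per bit of the grid
    # index (most significant bit first).
    N = 2 ** n_bits
    x = np.arange(N) / N
    tensor = f(x).reshape((2,) * n_bits)
    cores, ranks = [], [1]
    mat = tensor.reshape(2, -1)
    for _ in range(n_bits - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        # Keep singular values above a relative tolerance; r is the
        # truncated bond dimension on this link of the train.
        r = max(1, int(np.sum(S > tol * S[0])))
        cores.append(U[:, :r].reshape(ranks[-1], 2, r))
        ranks.append(r)
        mat = (S[:r, None] * Vt[:r]).reshape(r * 2, -1)
    cores.append(mat.reshape(ranks[-1], 2, 1))
    return cores

def qtt_to_full(cores):
    # Contract the train back into the full sample vector (for checking).
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape(-1)

# Example: compress a smooth periodic function and check the reconstruction.
f = lambda x: np.exp(np.sin(2 * np.pi * x))
cores = quantize_to_qtt(f, n_bits=16)
x = np.arange(2**16) / 2**16
print("bond dims:", [c.shape[2] for c in cores[:-1]])
print("max reconstruction error:", np.abs(qtt_to_full(cores) - f(x)).max())

For a smooth function like this, the truncated bond dimensions should stay far below the untruncated maximum even though the full sample vector has 65,536 entries; that gap is the compression the abstract refers to. The TTN methods in this work generalize the same idea from a 1D train to tree layouts.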
Presenters
- Ryan Anselm, Simons Foundation (Flatiron Institute)
Authors
- Ryan Anselm, Simons Foundation (Flatiron Institute)
- Ryan Levy, Simons Foundation (Flatiron Institute)
- Joseph Antony Tindall, Simons Foundation (Flatiron Institute)