
Arithmetic tensor network for multi-variable function integration

POSTER

Abstract

Exact integration of a discretized d-variable function by summing over all grid points requires a computational cost exponential in d. Tensor networks (TNs), on the other hand, are known to represent high-dimensional functions with polynomial memory. We propose a TN ansatz for representing general high-dimensional functions and their integrals. Unlike the standard variational/time-evolution approach well known to the physics TN community, or an optimization/fitting approach, we obtain the TN for the polynomial approximation of a high-dimensional function directly as a network of fixed small tensors. We call the resulting TN an arithmetic TN, since it is analogous to a classical binary circuit that computes function values via arithmetic operations with a set of known gates. (Approximate) integration of the function is performed by tracing over the external legs of the arithmetic TN and approximately contracting the resulting closed TN. We give numerical examples of approximately integrating polynomials of a specific form, quadratic Gaussians with quartic perturbations, and feed-forward neural networks -- although the same idea in principle applies to any function that admits a polynomial approximation.
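The abstract's central point -- that once a discretized function is held in a tensor network, its integral reduces to contracting a quadrature weight vector into each external leg, at polynomial rather than exponential cost -- can be illustrated with a small sketch. This is not the authors' arithmetic TN construction; it uses a hand-built tensor-train (MPS) form of the toy function f(x1,...,xd) = x1 + ... + xd, which happens to have exact rank 2, purely to show the contraction pattern:

```python
import numpy as np

# Illustrative sketch (not the authors' arithmetic TN construction):
# integrate f(x1,...,xd) = x1 + ... + xd over [0,1]^d, where the
# discretized f is stored as a rank-2 tensor train (MPS).
# Summing the full grid would cost n**d; contracting the TN with a
# quadrature weight vector on each external leg costs O(d * n * r^2).

d, n = 10, 51                       # number of variables, grid points per axis
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]
w = np.full(n, h)                   # trapezoid-rule weights on each axis
w[0] = w[-1] = h / 2

# TT cores, each of shape (r_left, n, r_right), encoding f = sum_i x_i:
# [x, 1] * [[1, 0], [x, 1]] * ... * [1, x]^T
first = np.zeros((1, n, 2)); first[0, :, 0] = x;   first[0, :, 1] = 1.0
mid   = np.zeros((2, n, 2)); mid[0, :, 0] = 1.0;   mid[1, :, 0] = x; mid[1, :, 1] = 1.0
last  = np.zeros((2, n, 1)); last[0, :, 0] = 1.0;  last[1, :, 0] = x
cores = [first] + [mid] * (d - 2) + [last]

# "Trace over the external legs": contract w into each core's physical
# index, then multiply the resulting small matrices left to right.
result = np.eye(1)
for core in cores:
    result = result @ np.einsum('k,akb->ab', w, core)
integral = result.item()            # exact value is d/2 since each
print(integral)                     # one-dimensional integral of x is 1/2
```

The trapezoid rule is exact for this linear integrand, so the contraction recovers d/2 = 5.0 up to floating-point error. The arithmetic TN of the abstract plays the role of the hand-built cores here: it assembles such a network from fixed small tensors for a general polynomial approximant, after which the same trace-and-contract step yields the (approximate) integral.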

Presenters

  • Ruojing Peng

    Caltech

Authors

  • Ruojing Peng

    Caltech

  • John C Gray

    Caltech

  • Garnet Chan

Caltech