Progress in Machine Learning with Tensor Networks

Invited

Abstract

Tensor networks are well known in theoretical condensed matter physics and other areas of physics as a tool for solving strongly correlated electron problems, but they are actually a general technique for compressing large tensors, similar to low-rank factorizations of matrices. In recent years, tensor networks have found diverse uses within applied mathematics, and some of these ideas are making their way back into physics. After reviewing tensor networks, I will discuss recent progress in using them as a type of machine learning model, in contrast to neural networks and other model families. Tensor networks can give state-of-the-art results, but what is most exciting is the progress in algorithm development and machine learning theory that they make possible.
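
To make the compression analogy concrete, the following is a minimal illustrative sketch (not taken from the talk) of the tensor-train / matrix product state format: a large tensor is factored into a chain of small three-index cores by repeated truncated SVDs, generalizing low-rank matrix factorization. The function names and the max_rank parameter are assumptions introduced here for illustration only.

    # Minimal tensor-train (MPS) compression sketch via sequential truncated SVDs.
    # All names (tensor_train, contract, max_rank) are illustrative, not from the talk.
    import numpy as np

    def tensor_train(tensor, max_rank=8):
        """Factor an n-index tensor into a chain ("train") of 3-index cores."""
        dims = tensor.shape
        cores = []
        rank = 1
        mat = tensor
        for d in dims[:-1]:
            mat = mat.reshape(rank * d, -1)
            u, s, vt = np.linalg.svd(mat, full_matrices=False)
            keep = min(max_rank, len(s))           # truncate to bound the bond rank
            cores.append(u[:, :keep].reshape(rank, d, keep))
            mat = np.diag(s[:keep]) @ vt[:keep]    # pass the remainder down the chain
            rank = keep
        cores.append(mat.reshape(rank, dims[-1], 1))
        return cores

    def contract(cores):
        """Rebuild the full tensor from its cores (to check the approximation)."""
        out = cores[0]
        for core in cores[1:]:
            out = np.tensordot(out, core, axes=([-1], [0]))
        return out.reshape([c.shape[1] for c in cores])

    # Example: a smooth 6-index tensor compresses well at small bond rank.
    grid = np.linspace(0, 1, 4)
    full = np.cos(sum(np.meshgrid(*[grid] * 6, indexing="ij")))
    cores = tensor_train(full, max_rank=4)
    approx = contract(cores)
    print("relative error:", np.linalg.norm(full - approx) / np.linalg.norm(full))
    print("parameters:", full.size, "->", sum(c.size for c in cores))

In a machine learning setting, the cores themselves become the trainable parameters of the model rather than the output of a one-shot decomposition, but the compression principle is the same.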

Presenters

  • Edwin Stoudenmire

    Center for Computational Quantum Physics, Flatiron Institute, Simons Foundation

Authors

  • Edwin Stoudenmire

    Center for Computational Quantum Physics, Flatiron Institute, Simons Foundation