Machine Learning with Tensor Networks

Invited

Abstract

Tensor networks are a technique developed to compress otherwise exponentially large many-body wavefunctions into a form that can be optimized, and whose properties can be computed, with only polynomial effort. Key examples include the matrix product state (MPS) and projected entangled pair state (PEPS) tensor networks, which give state-of-the-art results for challenging systems of strongly correlated electrons. But in recent years, it has been appreciated that tensor networks are a broader technique for compressing very large linear functions, which appear in many different problem domains. One very promising domain is machine learning of real-world data, where certain powerful models closely resemble wavefunction tensors and where tensor networks can be applied directly, just as in physics.
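To make the compression idea concrete, here is a minimal sketch, assuming NumPy, of factoring a dense state vector into an MPS by repeated truncated singular value decompositions; the function names (`vector_to_mps`, `mps_to_vector`) and the bond-dimension cap are illustrative assumptions, not code from the talk.

```python
# Illustrative sketch: compress a length-2**n vector into an MPS via
# repeated truncated SVDs. Names and parameters here are hypothetical.
import numpy as np

def vector_to_mps(psi, n_sites, max_bond=8):
    """Factor a length-2**n_sites vector into a list of rank-3 MPS tensors,
    keeping at most `max_bond` singular values at each cut."""
    tensors = []
    remainder = psi.reshape(1, -1)               # (left_bond, rest)
    for _ in range(n_sites - 1):
        left_bond = remainder.shape[0]
        remainder = remainder.reshape(left_bond * 2, -1)
        u, s, vh = np.linalg.svd(remainder, full_matrices=False)
        chi = min(max_bond, len(s))              # truncated bond dimension
        tensors.append(u[:, :chi].reshape(left_bond, 2, chi))
        remainder = s[:chi, None] * vh[:chi]     # push weights to the right
    tensors.append(remainder.reshape(remainder.shape[0], 2, 1))
    return tensors

def mps_to_vector(tensors):
    """Contract the MPS back into a dense vector (for checking only)."""
    out = tensors[0]
    for t in tensors[1:]:
        out = np.tensordot(out, t, axes=([-1], [0]))
    return out.reshape(-1)

# Example: compress a random 10-qubit state and check the truncation error.
n = 10
psi = np.random.randn(2**n)
psi /= np.linalg.norm(psi)
mps = vector_to_mps(psi, n, max_bond=16)
err = np.linalg.norm(psi - mps_to_vector(mps))
print(f"reconstruction error with bond dimension 16: {err:.3f}")
```

For a generic random state the truncation error is large, reflecting its near-maximal entanglement; the physical states that MPS targets in practice are far more compressible, which is where the polynomial-effort advantage comes from.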

I will give a progress update on work over the last few years bringing tensor network methods to bear on machine learning. In addition to empirical benchmark results on challenging data sets, there has been recent progress on theoretical results for machine learning as well. I will discuss how a theory of generative modeling of data can be developed in the context of matrix product state models. Other interesting developments include a theoretical understanding of the expressive power of various families of models and a direct connection between tensor networks and proposals for quantum machine learning.
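As one illustration of the generative-modeling picture mentioned above, the sketch below, again assuming NumPy and with hypothetical helper names, evaluates the Born-machine probability p(x) ∝ |ψ(x)|² that an MPS assigns to a bit string, with the normalization Z computed in time linear in the number of sites.

```python
# Illustrative sketch of an MPS "Born machine": p(x) = psi(x)**2 / Z for
# real-valued tensors. Names here are hypothetical, not code from the talk.
import numpy as np

def mps_amplitude(tensors, bits):
    """Contract an MPS (list of (Dl, 2, Dr) tensors) on a bit string."""
    vec = np.ones(1)
    for tensor, b in zip(tensors, bits):
        vec = vec @ tensor[:, b, :]   # fix the physical index, multiply on
    return vec.item()

def mps_norm_sq(tensors):
    """Compute Z = sum_x psi(x)**2 by sweeping transfer matrices once,
    so normalization costs linear (not exponential) effort in site count."""
    env = np.ones((1, 1))
    for t in tensors:
        # contract t with itself over the physical index (real tensors,
        # so no complex conjugation is needed)
        env = np.einsum('ab,aic,bid->cd', env, t, t)
    return env.item()

# Example: probability of the all-zeros string under a random 6-site MPS.
rng = np.random.default_rng(0)
tensors = ([rng.standard_normal((1, 2, 4))]
           + [rng.standard_normal((4, 2, 4)) for _ in range(4)]
           + [rng.standard_normal((4, 2, 1))])
Z = mps_norm_sq(tensors)
p_zeros = mps_amplitude(tensors, [0] * 6) ** 2 / Z
print(f"p(000000) = {p_zeros:.6f}")
```

The same transfer-matrix trick that computes Z also yields conditional probabilities site by site, which is what makes exact sampling and likelihood evaluation tractable for these models.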

Presenters

  • Edwin Stoudenmire

    CCQ, Flatiron Institute

Authors

  • Edwin Stoudenmire

    CCQ, Flatiron Institute