Scalable Quantum State Tomography with Attention Network

ORAL

Abstract

The problem of many-body wavefunction reconstruction, which suffers from exponential scaling in system size as well as from noisy state preparation and measurement, remains a major obstacle to the study of intermediate-scale quantum systems. Recent works have found success by recasting reconstruction as the task of learning the probability distribution of quantum state measurement outcomes, a natural fit for generative neural network models. Networks based on the attention mechanism, designed to learn long-range correlations in natural language sentences, appear especially well suited to learning highly entangled wavefunctions. In this work, we demonstrate that a generative network built on the attention mechanism, following the model proposed in "Attention Is All You Need" by Vaswani et al. (2017), can outperform previous neural-network-based approaches to quantum state tomography. Specifically, in addition to handling state-of-the-art system sizes, the attention mechanism accommodates noise by directly reconstructing the density matrix of mixed states. This work represents an important step forward in the applicability of machine learning to the study of noisy intermediate-scale quantum systems.
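The recasting described above treats the probability of a measurement outcome string as an autoregressive product of conditionals, p(s) = ∏ᵢ p(sᵢ | s₁…sᵢ₋₁), which a generative network learns from measurement data. The following is a minimal NumPy sketch of that factorization only; the logistic conditionals, weights `W`, and biases `b` are hypothetical stand-ins for a trained attention network's outputs, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters standing in for a trained network: each
# conditional p(s_i = 1 | s_<i) is a logistic function of earlier bits.
N = 4                                    # number of qubits (illustrative)
W = rng.normal(scale=0.5, size=(N, N))   # only the lower triangle is used
b = rng.normal(scale=0.1, size=N)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conditional(s_prev, i):
    """p(s_i = 1 | s_1 ... s_{i-1}) -- stand-in for the network output."""
    return sigmoid(W[i, :i] @ s_prev + b[i])

def sample():
    """Draw one measurement bitstring autoregressively, bit by bit."""
    s = np.zeros(N)
    for i in range(N):
        s[i] = rng.random() < conditional(s[:i], i)
    return s.astype(int)

def log_prob(s):
    """log p(s) = sum_i log p(s_i | s_<i): the training objective is to
    maximize this over observed measurement data."""
    lp = 0.0
    for i in range(N):
        p1 = conditional(s[:i], i)
        lp += np.log(p1 if s[i] == 1 else 1.0 - p1)
    return lp
```

Because every conditional is a normalized Bernoulli distribution, the model is normalized by construction over all 2^N outcome strings, which is what makes maximum-likelihood training of such generative models tractable at large system sizes.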

Presenters

  • Peter Cha

    Cornell University

Authors

  • Peter Cha

    Cornell University

  • Juan Carrasquilla

    Vector Institute for Artificial Intelligence

  • Paul Ginsparg

    Cornell University

  • Eun-Ah Kim

    Cornell University