Generative quantum models over tensor network architectures

ORAL

Abstract

Generative models produce new data according to an underlying probability distribution learned from a given data set. Inspired by the probabilistic nature of quantum mechanics, we employ a generative model known as the "Born machine", which uses a quantum state representation and learns the joint probabilities over such quantum degrees of freedom. To represent the quantum states, we train tensor network architectures that can provide efficient expressivity, training, and sampling when the quantum probability distribution has some local structure. Specifically, we train two types of tensor networks, matrix product states (MPS) and tree tensor networks (TTN), on both classical and quantum data. We first show that our TTN model can generate the MNIST handwritten digits efficiently. In the next step, we variationally train tensor network models to generate desired quantum entanglement produced by shallow quantum circuits, given iterative input-output information from actual quantum hardware with typically unknown systematic and random errors.
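The core idea above — assigning each configuration x the Born-rule probability p(x) = |⟨x|ψ⟩|², with |ψ⟩ parameterized by a matrix product state — can be sketched in a few lines. This is a minimal illustrative sketch in NumPy, not the authors' implementation; the tensor shapes, bond dimension, and random initialization are assumptions for demonstration.

```python
import numpy as np

# Minimal Born-machine sketch over an MPS (illustrative only).
# Each site carries a binary variable; the model assigns p(x) = |<x|psi>|^2,
# where |psi> is represented by a matrix product state with open boundaries.

rng = np.random.default_rng(0)
n_sites, phys_dim, bond_dim = 4, 2, 3  # assumed toy sizes

# Random MPS tensors with shape (left_bond, physical, right_bond);
# boundary bonds have dimension 1.
mps = [rng.normal(size=(1 if i == 0 else bond_dim,
                        phys_dim,
                        1 if i == n_sites - 1 else bond_dim))
       for i in range(n_sites)]

def amplitude(bits):
    """Contract the MPS along one configuration to obtain <x|psi>."""
    v = mps[0][:, bits[0], :]            # shape (1, right_bond)
    for i in range(1, n_sites):
        v = v @ mps[i][:, bits[i], :]    # carry the boundary vector along
    return v[0, 0]

# Born rule: unnormalized weights |<x|psi>|^2, normalized over all 2^n strings.
configs = [tuple(int(b) for b in np.binary_repr(k, width=n_sites))
           for k in range(phys_dim ** n_sites)]
weights = np.array([amplitude(c) ** 2 for c in configs])
probs = weights / weights.sum()
```

In practice one would not enumerate all 2^n configurations: the point of the MPS/TTN representation is that normalization and sampling can be done by local contractions, with direct (autoregressive-style) sampling site by site.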

Presenters

  • Khadijeh Najafi

    Virginia Tech

Authors

  • Khadijeh Najafi

    Virginia Tech

  • Ahmadreza Azizi

    Virginia Tech

  • Carlos Fuertes

    Waymo

  • Miles Stoudenmire

    Center for Computational Quantum Physics, Flatiron Institute, Simons Foundation

  • Masoud Mohseni

    Google Quantum AI Laboratory, Google Research