Learnability and Complexity of Quantum Samples
ORAL
Abstract
Given a quantum circuit, a quantum computer can sample the output distribution exponentially faster in the number of bits than classical computers. A similar exponential separation has yet to be established in generative models through quantum sample learning: given samples from an n-qubit computation, can we learn the underlying quantum distribution using models whose training parameters scale polynomially in n under a fixed training time? We study four kinds of generative models: Deep Boltzmann Machines (DBMs), Generative Adversarial Networks (GANs), Long Short-Term Memory (LSTM) networks, and Autoregressive GANs, on learning quantum data sets generated by deep random circuits. We demonstrate the autoregressive structure present in the underlying quantum distribution produced by random quantum circuits. Both numerical experiments and a theoretical proof in the case of the DBM show an exponentially growing complexity of learning-agent parameters required to achieve a fixed accuracy as n increases. Finally, we establish a connection between learnability and the complexity of generative models by benchmarking learnability against different sets of samples drawn from probability distributions with varying degrees of complexity in their quantum and classical representations.
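As a rough illustration of the quantum sample learning task (not the authors' implementation), the sketch below trains an LSTM autoregressively on n-qubit bitstring samples, factorizing p(x) as the product of p(x_i | x_<i). The model size, training schedule, and data are illustrative assumptions; in the paper the samples would come from measurements of a deep random circuit, whereas here a synthetic categorical distribution stands in.

# Minimal sketch, assuming PyTorch; hyperparameters and the synthetic
# data source are illustrative, not taken from the paper.
import torch
import torch.nn as nn

N_QUBITS = 8  # assumption: small n for demonstration


class BitstringLSTM(nn.Module):
    """Autoregressive model over bitstrings: p(x) = prod_i p(x_i | x_<i)."""

    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # logit for P(next bit = 1)

    def forward(self, bits):
        # bits: (batch, n) in {0, 1}; prepend a neutral start token so that
        # the logit at position i is conditioned only on bits before i.
        start = torch.full((bits.size(0), 1, 1), 0.5)
        x = torch.cat([start, bits[:, :-1].unsqueeze(-1).float()], dim=1)
        h, _ = self.lstm(x)
        return self.head(h).squeeze(-1)  # (batch, n) logits


# Synthetic stand-in for quantum samples: draw bitstrings from a fixed
# random categorical distribution over all 2^n outcomes.
probs = torch.distributions.Dirichlet(torch.ones(2 ** N_QUBITS)).sample()
idx = torch.multinomial(probs, num_samples=4096, replacement=True)
samples = ((idx.unsqueeze(-1) >> torch.arange(N_QUBITS)) & 1).float()

model = BitstringLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()
for step in range(200):  # maximize log-likelihood of the observed samples
    logits = model(samples)
    loss = loss_fn(logits, samples)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 50 == 0:
        print(step, loss.item())

Under this sketch, the scaling question in the abstract amounts to asking how the hidden size (and hence parameter count) of such a model must grow with N_QUBITS to keep the fidelity of the learned distribution fixed.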
Presenters
- Murphy Yuezhen Niu (Google AI Quantum, Google Quantum AI, Google Inc)
Authors
- Murphy Yuezhen Niu (Google AI Quantum, Google Quantum AI, Google Inc)
- Andrew Dai (Google Health)
- Li Li (Google Research, Google Inc)
- Vadim Smelyanskiy (Google AI Quantum, Google Quantum AI, Google - Venice, CA, Google Inc - Santa Barbara)
- Hartmut Neven (Google AI Quantum, Google Quantum AI, Google LLC, Google - Venice, CA)
- Sergio Boixo (Google Quantum AI, Google LLC)