Scalable Boson Sampling with Noisy Components

ORAL

Abstract

The goal of a Boson Sampler is to efficiently and scalably sample from a probability distribution that cannot be simulated efficiently on a classical computer, thus violating the Extended Church-Turing Thesis (ECTT). To properly falsify the ECTT, the physical device must do so even in the face of realistic noise. Scaling a Boson Sampler requires ever more copies of a set of fixed-size components (beamsplitters, detectors, etc.), so it is natural to consider noise models that act on each component independently. We show that for any such model, the per-component noise need only decrease polynomially in problem size to keep the sampling problem hard. In this sense, Boson Sampling with noise is scalable. However, the same result applies to a number of other quantum information systems, including universal circuit-model quantum computers. Such devices are widely believed to require error correction in order to be truly scalable, even though polynomial reduction of per-component errors would allow them to work without error correction. This belief is consistent with a stricter criterion: that a scalable device tolerate per-component error rates that remain constant in problem size, rather than merely polynomially small. We conclude that a more precise definition of scalability with noise is needed to properly evaluate Boson Samplers.
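The scaling claim above can be illustrated with a back-of-the-envelope sketch. This is not from the abstract itself, but a common way to see why polynomially decreasing per-component noise suffices: if each of m independent components deviates from its ideal operation by at most eps (in an appropriate distance measure), a union-style bound caps the total output deviation at m * eps, so choosing eps proportional to 1/m holds the total deviation at a constant budget regardless of device size. The function names below are illustrative, not from the paper.

```python
def total_deviation_bound(m: int, eps: float) -> float:
    """Union-style upper bound on total output deviation for m
    independent components, each with per-component noise eps."""
    return m * eps


def required_per_component_noise(m: int, budget: float) -> float:
    """Per-component noise level that keeps the total deviation
    bound within a fixed budget, for a device with m components."""
    return budget / m


# As the device grows, the per-component noise must shrink only
# linearly (polynomially) in m to keep the total deviation at 0.1.
for m in (10, 100, 1000):
    eps = required_per_component_noise(m, budget=0.1)
    assert abs(total_deviation_bound(m, eps) - 0.1) < 1e-12
```

The point of the sketch is that a polynomial (here linear) reduction in per-component noise is enough to keep the overall sampling error bounded, which is the sense of "scalable" the abstract then contrasts with the stricter constant-error-rate criterion.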

Authors

  • Tyler Keating

    University of New Mexico

  • Joseph Slote

    Carleton College

  • Gopikrishnan Muraleedharan

    University of New Mexico

  • Ezequiel Carrasco

    University of New Mexico

  • Ivan Deutsch

    University of New Mexico