Trainability of Quantum Neural Networks: Barren Plateaus and Scalability

Invited

Abstract

Quantum neural networks (QNNs) have generated excitement around the possibility of efficiently analyzing quantum data. However, one of the key open questions is how well QNNs will scale. The past year has seen significant analytical progress on the scaling of gradients in QNNs. It was recently discovered that certain QNN architectures can exhibit exponentially vanishing gradients, known as barren plateau landscapes. This leads to an exponential scaling in the precision required to estimate the gradient, making the training process inefficient. Nevertheless, several techniques have been shown to avoid barren plateaus, such as correlating parameters, employing local cost functions, and keeping the circuit depth shallow. In this talk, we will discuss recent progress in understanding the barren plateau phenomenon in QNNs, focusing especially on analytical results for gradient scaling.
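
For concreteness, the connection between vanishing gradients and required precision can be sketched as follows. Letting C(θ) denote the QNN cost function on n qubits (the notation here is illustrative, not taken from the talk), a barren plateau is typically formalized as an exponentially vanishing gradient variance; Chebyshev's inequality then links this to measurement cost:

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Barren plateau: gradient components vanish on average, and their
% variance is exponentially suppressed in the number of qubits n.
\[
  \mathbb{E}_{\theta}\!\left[\partial_{\mu} C(\theta)\right] = 0,
  \qquad
  \operatorname{Var}_{\theta}\!\left[\partial_{\mu} C(\theta)\right]
  \in O\!\left(b^{-n}\right), \quad b > 1.
\]
% By Chebyshev's inequality, observing a gradient component larger
% than a threshold delta is then exponentially unlikely:
\[
  \Pr\!\left(\left|\partial_{\mu} C(\theta)\right| \geq \delta\right)
  \leq
  \frac{\operatorname{Var}_{\theta}\!\left[\partial_{\mu} C(\theta)\right]}{\delta^{2}}.
\]
% Resolving such a gradient above shot noise therefore requires a
% number of measurements that grows exponentially with n.
\end{document}

This exponential suppression is precisely the exponential scaling in gradient precision referenced above.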

Presenters

  • Patrick Coles

    Theoretical Division (T-Division), Los Alamos National Laboratory

Authors

  • Patrick Coles

    Theoretical Division (T-Division), Los Alamos National Laboratory