Signatures of double descent in deep quantum models
ORAL
Abstract
Deep neural networks generalize remarkably well despite being highly over-parameterized: past the interpolation threshold, where every training data point is fit perfectly, the generalization error decreases again as the number of parameters grows, in apparent violation of the classical bias-variance trade-off. We present thorough empirical evidence of this “double descent” phenomenon in deep quantum neural networks. We also present tools to study the transition, including the bias-variance decomposition, the Quantum Neural Tangent Kernel, and the Fisher information. We further investigate where deep quantum learning is feasible and propose efficient ansätze. This is a first step towards using deep quantum models to avoid barren plateaus and achieve fast convergence.
Presenters
-
Aroosa Ijaz
University of Waterloo
Authors
-
Aroosa Ijaz
University of Waterloo
-
Jason W Rocks
Boston University
-
Juan Carrasquilla
Vector Institute for Artificial Intelligence
-
Evan Peters
University of Waterloo
-
Marco Cerezo
Los Alamos National Laboratory