Representation Learning via Quantum Neural Tangent Kernels
ORAL
Abstract
Variational quantum circuits are used in quantum machine learning and variational quantum simulation tasks. It remains unclear, however, how to design good variational circuits, or how to predict their performance on given learning or optimization tasks. In this paper, we address these problems by studying variational quantum circuits through the theory of neural tangent kernels. We define quantum neural tangent kernels and derive the dynamical equation of their loss function in optimization and learning tasks. We define and analyze quantum neural tangent kernels in the frozen limit, where the variational angles change slowly enough that a linear perturbation of the angles suffices to describe the dynamics; in machine learning this is commonly known as the lazy training regime. We then extend the analysis to a dynamical setting, including quadratic corrections in the variational angles. We define a large-width limit for quantum kernels, showing that a hybrid quantum-classical neural network can be approximately Gaussian. Our results elucidate a regime in which an analytical understanding of the training dynamics of variational quantum circuits, used for quantum machine learning and optimization problems, is possible.
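The frozen (lazy-training) limit described above can be illustrated numerically. The sketch below is not the paper's code: it is a minimal, assumed toy example with a single-qubit, two-parameter variational circuit, where the circuit output is a Z-basis expectation value, gradients come from the standard parameter-shift rule for Pauli rotations, and the quantum neural tangent kernel for a single data point reduces to the squared norm of the gradient. In the frozen limit the residual should decay roughly geometrically at rate set by the kernel.

```python
import numpy as np

# Toy 1-qubit variational circuit (illustrative assumption, not the paper's ansatz):
# f(theta) = <0| U(theta)^dagger Z U(theta) |0>, with U = Ry(theta[1]) Rx(theta[0]).
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def rot(axis, theta):
    """Single-qubit rotation exp(-i * theta * axis / 2)."""
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * axis

def f(theta):
    """Circuit output: Z expectation after Rx(theta[0]) then Ry(theta[1])."""
    psi = rot(Y, theta[1]) @ rot(X, theta[0]) @ np.array([1.0, 0.0], dtype=complex)
    return float(np.real(psi.conj() @ Z @ psi))

def grad(theta):
    """Exact gradient via the parameter-shift rule (valid for Pauli rotations)."""
    g = np.zeros_like(theta)
    for k in range(len(theta)):
        plus, minus = theta.copy(), theta.copy()
        plus[k] += np.pi / 2
        minus[k] -= np.pi / 2
        g[k] = (f(plus) - f(minus)) / 2
    return g

def qntk(theta):
    """Quantum neural tangent kernel for a single point: K = sum_k (df/dtheta_k)^2."""
    g = grad(theta)
    return float(g @ g)

# Frozen-limit prediction: for gradient descent on L = (f - y)^2 / 2, the residual
# eps = f - y evolves as eps_{t+1} ~ (1 - eta * K) * eps_t while theta barely moves.
theta = np.array([0.4, 0.3])
y_target = 0.0
eta = 0.1
K0 = qntk(theta)
eps = [f(theta) - y_target]
for _ in range(20):
    theta = theta - eta * (f(theta) - y_target) * grad(theta)
    eps.append(f(theta) - y_target)

predicted = eps[0] * (1 - eta * K0) ** 20
print(f"K = {K0:.4f}, residual after 20 steps: {eps[-1]:.4f} "
      f"(frozen-limit prediction {predicted:.4f})")
```

Because the kernel here is not literally frozen (theta does move), the simulated residual only approximately tracks the geometric prediction; the gap between the two numbers is exactly the dynamical correction the abstract refers to when it goes beyond the linear perturbation.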
Publication: to appear on arXiv later this year.
Presenters
-
Junyu Liu
University of Chicago
Authors
-
Junyu Liu
University of Chicago
-
Francesco Tacchino
IBM Research Zurich, IBM Quantum
-
Jennifer R Glick
IBM Quantum, IBM TJ Watson Research Center
-
Liang Jiang
University of Chicago
-
Antonio Mezzacapo
IBM