QuGStep: Refining Step Size Selection in Gradient Estimation for Quantum Optimization
ORAL
Abstract
Variational quantum algorithms (VQAs) offer a promising approach to optimization on near-term quantum devices, but efficient gradient estimation remains challenging due to limited measurement (shot) resources on noisy intermediate-scale quantum (NISQ) hardware. In this talk, I will introduce QuGStep, an adaptive method that selects the step size for gradient estimation based on the available shot budget, reducing the number of measurements while maintaining effective convergence. QuGStep combines theoretical derivation with experimental validation on molecular systems; the results show that it reduces measurement requirements by over 90% compared to fixed-step methods, making it highly effective for NISQ devices. This advancement significantly improves the efficiency of VQAs, contributing to the broader goal of practical quantum optimization on current hardware.
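The abstract does not spell out the step-size rule, but the underlying idea (choosing a finite-difference step that balances truncation error against shot-noise error for a given measurement budget) can be sketched in a few lines of Python. The function names, the N^(-1/6) scaling heuristic, and the `energy_fn` interface below are illustrative assumptions, not the actual QuGStep derivation.

```python
import numpy as np

def shot_aware_step(n_shots, noise_var=1.0, curvature_scale=1.0):
    """Heuristic step size for a central-difference gradient estimate.

    Balances the O(h^2) truncation bias against the O(1 / (h * sqrt(N)))
    shot-noise error of the difference quotient; the optimum scales roughly
    as N^(-1/6).  Illustrative only -- not the QuGStep formula.
    """
    return (noise_var / (curvature_scale * n_shots)) ** (1.0 / 6.0)

def central_diff_gradient(energy_fn, theta, n_shots):
    """Estimate the gradient of a shot-noise-limited cost by central differences.

    `energy_fn(theta, n_shots)` is assumed to return a noisy estimate of the
    cost (e.g., a VQE energy) evaluated with `n_shots` measurements.
    """
    h = shot_aware_step(n_shots)
    grad = np.zeros_like(theta, dtype=float)
    for i in range(len(theta)):
        shift = np.zeros_like(theta, dtype=float)
        shift[i] = h
        e_plus = energy_fn(theta + shift, n_shots)
        e_minus = energy_fn(theta - shift, n_shots)
        grad[i] = (e_plus - e_minus) / (2.0 * h)
    return grad
```

With a shot-noise-limited VQE energy routine supplied as `energy_fn`, such an estimator can be plugged into any gradient-based optimizer; the point of the sketch is only that the step size is tied to the shot budget rather than fixed in advance.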
Presenters
-
Linghua Zhu
University of Washington
Authors
-
Linghua Zhu
University of Washington
-
Senwei Liang
Lawrence Berkeley National Laboratory
-
Xiaosong Li
University of Washington
-
Chao Yang
Lawrence Berkeley National Laboratory