Interplay of local and induced decoherence in circuit-QED: Experiment
ORAL
Abstract
It has long been known that the lifetime of a superconducting qubit (T1) can change dramatically during dispersive measurement, and that it typically changes for the worse. A T1 event during readout results in a non-QND readout error, reducing readout fidelity. Increasingly, quantum algorithms require conditional (or mid-circuit) measurements, in which an operation is performed on the qubit based on the measurement result; a T1 event during a mid-circuit measurement can cause the wrong feedback to be applied to the qubit. While several theories have been put forward to explain the change in T1 during measurement, experimental evidence for or against them has been lacking. In this talk we put forward a new explanation for T1 reduction during dispersive readout: the modification of the qubit T1 can be described within a self-consistent theoretical framework for calculating the engineered dissipation on the qubit after adiabatic elimination of the readout resonator. We support this new theory with experimental evidence from an IBM Quantum processor.
Presenters
-
Ted Thorbeck
IBM Quantum
Authors
-
Ted Thorbeck
IBM Quantum
-
Zhihao Xiao
University of Massachusetts Lowell
-
Juzar Thingna
University of Massachusetts Lowell
-
Archana Kamal
University of Massachusetts Lowell
-
Luke Govia
IBM Quantum