Hamiltonian Algorithms
ORAL · Invited
Abstract
We present two nearly optimal algorithms concerning local Hamiltonians. The first implements the real-time evolution unitary generated by a lattice Hamiltonian on a general-purpose quantum computer. Our algorithm simulates the time evolution of such a Hamiltonian on n qubits for time T up to error ϵ using O(nT polylog(nT/ϵ)) gates with depth O(T polylog(nT/ϵ)). It is the first simulation algorithm that achieves gate cost quasilinear in nT and polylogarithmic in 1/ϵ. We also prove a matching lower bound on the gate count of such a simulation: any quantum algorithm that simulates a piecewise-constant bounded local Hamiltonian in one dimension to constant error requires Ω(nT) gates in the worst case. Our algorithm is based on a decomposition of the time-evolution unitary into a product of small unitaries using Lieb-Robinson bounds. The second algorithm learns a local Hamiltonian given black-box access to its short-time evolution unitary channel. We show how to learn the coefficients of a lattice Hamiltonian to error ϵ with query complexity O((log n)/(tϵ^2) · polylog(1/ϵ)) and classical postprocessing time linear in the product of the query count and the number of qubits. This uses a convergent series expansion of the real-time evolution operator in the evolution time t; the quantum device is required only to measure local Pauli operators.
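A toy numerical sketch of the first result's central idea, decomposing a lattice time-evolution unitary into a product of small local unitaries, is below. This is not the paper's Lieb-Robinson-based (HHKL) decomposition, which achieves the quasilinear gate count; it is a simple even/odd (brickwork) first-order splitting on a hypothetical 6-qubit Ising-type chain, included only to illustrate what "product of small unitaries approximating exp(-iHT)" means. All parameter choices (couplings, field strengths, step count) are illustrative assumptions.

```python
# Illustrative sketch (NOT the paper's algorithm): approximate exp(-iHT) for a
# 1D lattice Hamiltonian by a product of local unitaries, then compare with
# exact evolution. The paper's decomposition uses Lieb-Robinson bounds to get
# quasilinear gate cost; here we use a naive even/odd first-order splitting.
import numpy as np
from scipy.linalg import expm

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def op_on(site_ops, n):
    """Tensor product acting with the given single-qubit operators on the
    given sites and identity elsewhere."""
    ops = [I2] * n
    for site, op in site_ops:
        ops[site] = op
    out = ops[0]
    for op in ops[1:]:
        out = np.kron(out, op)
    return out

n, T, steps = 6, 1.0, 200          # illustrative sizes, not from the paper
rng = np.random.default_rng(0)

# H = sum_j c_j Z_j Z_{j+1} + sum_j h_j X_j, split into two groups whose
# internal terms commute: even ZZ bonds vs. odd ZZ bonds plus X fields.
H_even = sum(rng.uniform(0.5, 1.0) * op_on([(j, Z), (j + 1, Z)], n)
             for j in range(0, n - 1, 2))
H_odd = sum(rng.uniform(0.5, 1.0) * op_on([(j, Z), (j + 1, Z)], n)
            for j in range(1, n - 1, 2))
H_odd = H_odd + sum(rng.uniform(0.2, 0.4) * op_on([(j, X)], n)
                    for j in range(n))
H = H_even + H_odd

# Exact evolution vs. a product of `steps` pairs of small unitaries.
U_exact = expm(-1j * T * H)
dt = T / steps
U_step = expm(-1j * dt * H_even) @ expm(-1j * dt * H_odd)
U_approx = np.linalg.matrix_power(U_step, steps)

err = np.linalg.norm(U_exact - U_approx, 2)   # operator-norm error
print(f"operator-norm error: {err:.2e}")
```

Each factor exp(-i dt H_even) and exp(-i dt H_odd) is itself a product of commuting one- and two-qubit gates, so the whole approximation is a circuit of local gates; the error shrinks as the step count grows, at first order in dt.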
Publication: https://arxiv.org/abs/2108.04842; https://arxiv.org/abs/1801.03922
Presenters
- Jeongwan Haah (Microsoft)
Authors
- Jeongwan Haah (Microsoft)