
Concentration for Trotter error

POSTER

Abstract

Quantum simulation is a promising application of quantum computers. Product formulas, or Trotterization, are the oldest and still among the most studied methods for quantum simulation, owing to their relatively simple implementation without ancillae. For an accurate approximation in the spectral norm, the gate complexity of state-of-the-art product formulas depends on the number of terms in the Hamiltonian and on a certain 1-norm of its local-term coefficients.
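
For concreteness (the notation below is generic and not taken from the poster), write the Hamiltonian as a sum of L local terms with unit-norm parts. The first-order product formula with r Trotter steps, and the coefficient 1-norm that governs the worst-case spectral-norm error bounds, read

    H = \sum_{\gamma=1}^{L} \lambda_\gamma H_\gamma, \quad \|H_\gamma\| = 1, \qquad
    e^{-iHt} \approx \Big( \prod_{\gamma=1}^{L} e^{-i \lambda_\gamma H_\gamma t / r} \Big)^{r}, \qquad
    \lambda_{1} := \sum_{\gamma=1}^{L} |\lambda_\gamma| .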

This work considers the concentration aspects of Trotter error: we show quantitatively that the Trotter error "typically" exhibits 2-norm scaling, with the existing 1-norm estimates applying to the "worst" cases. For general k-local Hamiltonians, we obtain gate-count estimates for input states drawn from a 1-design ensemble (e.g., computational basis states). Our gate count depends on the number of terms in the Hamiltonian but replaces the 1-norm quantity by its 2-norm analog, giving a significant speedup for systems with large connectivity. Our concentration results generalize to Hamiltonians with fermionic terms and to input states restricted to a low-particle-number subspace. Further, when the Hamiltonian itself has random coefficients, as in the SYK models, we show the stronger result that the 2-norm behavior persists even for the worst input state.
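
In the same illustrative notation, the worst-case 1-norm quantity above is replaced by its 2-norm analog,

    \lambda_{2} := \Big( \sum_{\gamma=1}^{L} |\lambda_\gamma|^{2} \Big)^{1/2} \le \lambda_{1},

which, by Cauchy-Schwarz, can be smaller than \lambda_{1} by a factor of up to \sqrt{L}; for highly connected systems with many terms of comparable strength, this gap underlies the claimed speedup.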

Our main technical tool is a family of simple but versatile inequalities from the theory of non-commutative martingales, known as uniform smoothness. We use them to derive hypercontractivity, i.e., p-norm estimates for low-degree polynomials, which imply concentration via Markov's inequality. In terms of optimality, we give examples that simultaneously match our p-norm estimates and the spectral-norm estimates; this shows that our improvement comes from asking a qualitatively different question from the one addressed by spectral-norm bounds. Our results give evidence that product formulas may work much better in practice than expected.
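
Schematically, and again in generic notation not taken from the poster: once a p-th moment bound is available for the Trotter-error observable X over the random input state, a tail bound follows from Markov's inequality applied to |X|^p,

    \Pr\big[ |X| \ge \lambda \big] = \Pr\big[ |X|^{p} \ge \lambda^{p} \big] \le \frac{\mathbb{E}\,|X|^{p}}{\lambda^{p}},

so hypercontractive p-norm estimates for the low-degree polynomial X translate directly into concentration around its typical (2-norm) size.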

Presenters

  • Chi-Fang Chen

    Caltech

Authors

  • Chi-Fang Chen

    Caltech

  • Fernando Brandao

    Caltech, Amazon