Quantum-inspired nonlocal parallel tempering with approximate tensor network contractions
ORAL
Abstract
We use tensor networks to represent the probability distributions of spin-glass models that encode combinatorial optimization problems. We then develop a greedy approximate tensor-network contraction scheme for efficient Gibbs sampling of the low-energy states of structured quasi-2D spin-glass models. This deterministic approach can reveal certain geometrical properties of such hard optimization problems known as “droplet excitations”. Knowledge of these droplets can be used to enhance traditional probabilistic Monte Carlo techniques. Specifically, we provide two new cluster Monte Carlo techniques that combine tensor-network contractions with “parallel tempering” (PT). In the first method, the tensor networks provide high-quality initial states for PT instead of traditional random restarts. In the second method, the tensor-network contraction provides droplet information that can be used for dynamical cluster updates, unfreezing low-temperature replicas near a computational phase transition. We demonstrate several orders of magnitude improvement in time-to-solution compared to local Monte Carlo techniques such as PT. We also observe significant improvement over commercial solvers, such as Gurobi, and other heuristic cluster PT approaches powered by iso-energetic moves.
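As a rough illustration of the Monte Carlo component described above, the sketch below implements standard parallel tempering (replica exchange) for an Ising spin glass, with an `init_states` hook where high-quality configurations (e.g. from an approximate tensor-network contraction) could replace the usual random restarts. All function names, parameters, and conventions here are illustrative assumptions, not code from this work.

```python
# Minimal parallel-tempering sketch for an Ising spin glass (illustrative only).
# Assumes a symmetric coupling matrix J with zero diagonal and local fields h.
import numpy as np

def energy(J, h, s):
    """Ising energy E(s) = -1/2 * s^T J s - h^T s."""
    return -0.5 * s @ J @ s - h @ s

def metropolis_sweep(J, h, s, beta, rng):
    """One sweep of single-spin-flip Metropolis updates at inverse temperature beta."""
    for i in rng.permutation(len(s)):
        # Energy change from flipping spin i (J has zero diagonal).
        dE = 2.0 * s[i] * (J[i] @ s + h[i])
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i] = -s[i]
    return s

def parallel_tempering(J, h, betas, n_sweeps, init_states=None, seed=0):
    """Run PT over an ascending ladder of inverse temperatures `betas`.

    `init_states` lets one inject precomputed starting configurations
    (e.g. low-energy states from a tensor-network sampler) instead of
    random restarts; shape (len(betas), n_spins) with entries +/-1.
    """
    rng = np.random.default_rng(seed)
    n = len(h)
    replicas = (np.array(init_states, dtype=int).copy() if init_states is not None
                else rng.choice([-1, 1], size=(len(betas), n)))
    best_s, best_E = None, np.inf
    for _ in range(n_sweeps):
        # Local updates within each replica.
        for k, beta in enumerate(betas):
            metropolis_sweep(J, h, replicas[k], beta, rng)
        # Replica-exchange moves between neighbouring temperatures:
        # accept with probability min(1, exp[(beta_{k+1}-beta_k)(E_{k+1}-E_k)]).
        for k in range(len(betas) - 1):
            dBeta = betas[k + 1] - betas[k]
            dE = energy(J, h, replicas[k + 1]) - energy(J, h, replicas[k])
            if rng.random() < np.exp(min(0.0, dBeta * dE)) or dBeta * dE >= 0:
                replicas[[k, k + 1]] = replicas[[k + 1, k]]
        # Track the best configuration found in the lowest-temperature replica.
        E0 = energy(J, h, replicas[-1])
        if E0 < best_E:
            best_E, best_s = E0, replicas[-1].copy()
    return best_s, best_E
```

A droplet-based cluster variant would additionally flip whole groups of spins identified by the tensor-network contraction, rather than single spins, which is the nonlocal update the abstract refers to.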
Presenters
-
Daniel Eppens
Google Inc.
Authors
-
Masoud Mohseni
Google Research, Google Quantum AI Laboratory
-
Daniel Eppens
Google Inc.
-
Marek M Rams
Marian Smoluchowski Institute of Physics, Jagiellonian University
-
Sergio Boixo
Google Inc.
-
Hartmut Neven
Google Inc., Google AI Quantum