Variational Neural Annealing
Invited
Abstract
Many combinatorial optimization problems relevant to computer science, computational biology, and physics can be tackled with simulated annealing, which is a powerful framework for optimizing the properties of complex systems through the lens of statistical mechanics. However, simulated annealing and its quantum counterpart, simulated quantum annealing, are traditionally implemented via Markov chain Monte Carlo, often displaying slow convergence to optimal solutions for challenging optimization problems. In this talk, we present a combination of the variational principle in classical and quantum physics with recurrent neural networks (RNNs), whose dynamics are naturally devoid of slow Markov chains, to accurately emulate annealing in its classical and quantum formulations, for the purpose of solving optimization problems. We find that our variational implementation of classical annealing is not only superior to its quantum analog in terms of speed of convergence and accuracy of solutions but also outperforms traditional simulated annealing and simulated quantum annealing on prototypical spin glass models. These results advocate for the use of our variational implementation of classical annealing as a competitive algorithm to tackle real-world optimization problems.
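For a concrete picture of the classical variant described above, the sketch below illustrates the idea; it is a minimal illustration under stated assumptions, not the authors' implementation. An autoregressive GRU parameterizes a distribution over Ising spins, configurations are sampled directly (no Markov chain), and the variational free energy F = ⟨E⟩ + T⟨log p⟩ is minimized with a REINFORCE-style gradient while the temperature T is annealed toward zero. The spin-glass couplings, network size, annealing schedule, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of variational classical annealing with an autoregressive RNN.
# Not the authors' released code; couplings, model size, and schedule are illustrative.
import torch
import torch.nn as nn

class AutoregressiveRNN(nn.Module):
    """GRU that outputs p(s_i = +1 | s_<i), one spin at a time."""
    def __init__(self, n_spins, hidden=32):
        super().__init__()
        self.n_spins, self.hidden = n_spins, hidden
        self.cell = nn.GRUCell(1, hidden)
        self.head = nn.Linear(hidden, 1)

    def sample(self, batch):
        h = torch.zeros(batch, self.hidden)
        x = torch.zeros(batch, 1)                      # fixed start token
        spins, log_prob = [], torch.zeros(batch)
        for _ in range(self.n_spins):
            h = self.cell(x, h)
            p_up = torch.sigmoid(self.head(h)).squeeze(-1)
            s = torch.bernoulli(p_up) * 2 - 1          # sample spin in {-1, +1}
            log_prob = log_prob + torch.log(
                torch.where(s > 0, p_up, 1 - p_up) + 1e-12)
            spins.append(s)
            x = s.unsqueeze(-1)                        # feed sample to the next step
        return torch.stack(spins, dim=1), log_prob

def ising_energy(spins, J):
    """Energy of a fully connected spin glass H = -1/2 sum_ij J_ij s_i s_j."""
    return -0.5 * torch.einsum('bi,ij,bj->b', spins, J, spins)

def variational_classical_annealing(n_spins=20, steps=2000, batch=256, T0=2.0):
    torch.manual_seed(0)
    J = torch.randn(n_spins, n_spins) / n_spins**0.5   # illustrative random couplings
    J = (J + J.T) / 2
    J.fill_diagonal_(0)
    model = AutoregressiveRNN(n_spins)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(steps):
        T = T0 * (1 - step / steps)                    # linear annealing schedule
        spins, log_prob = model.sample(batch)
        energy = ising_energy(spins, J)
        # Variational free energy F = <E> + T <log p>, minimized with a
        # REINFORCE-style gradient and a mean baseline to reduce variance.
        free_energy = energy + T * log_prob
        loss = ((free_energy - free_energy.mean()).detach() * log_prob).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    print("lowest sampled energy:", energy.min().item())

if __name__ == "__main__":
    variational_classical_annealing()
```

The gradient surrogate follows from ∇F = E_p[(E(s) + T log p(s)) ∇ log p(s)], since the extra term T E_p[∇ log p(s)] vanishes; subtracting a baseline leaves the expectation unchanged while lowering variance.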
Presenters
-
Mohamed Hibat-Allah
University of Waterloo, Vector Institute for Artificial Intelligence
Authors
-
Mohamed Hibat-Allah
University of Waterloo, Vector Institute for Artificial Intelligence
-
Estelle Inack
Perimeter Institute for Theoretical Physics
-
Roeland Cornelis Wiersema
Vector Institute for Artificial Intelligence
-
Roger G. Melko
University of Waterloo
-
Juan Carrasquilla
Vector Institute for Artificial Intelligence