Deep generative spin-glass models with normalizing flows
ORAL
Abstract
We develop and train a novel universal class of deep spin-glass models that can learn to represent multiscale phenomena in physics and computer science, including critical phenomena, discrete optimization, and probabilistic inference in graphical models. To this end, we first provide a continuous formulation of spin-glasses, converting the discrete Boltzmann distributions into physically equivalent continuous distributions. We then use recent techniques in deep learning known as “Normalizing Flows” to generate new low-energy states of such complex systems below the spin-glass phase transition. In particular, we demonstrate that real non-volume preserving (Real NVP) flows can be successfully trained to generate complex spin-glass distributions. We explore two alternative methods for training the normalizing flow, based on minimizing the reverse and forward Kullback-Leibler divergences. Moreover, we show how the problem of mode collapse for such deep generative models can be overcome at or below a critical point.
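To make the training objective in the abstract concrete, the sketch below is an illustrative PyTorch toy, not the authors' implementation: the quartic confinement term (a stand-in for the paper's physically equivalent continuous relaxation of the discrete Boltzmann distribution), the random coupling matrix, the network widths, and all hyperparameters are assumptions made for the example. It trains a small Real NVP flow by minimizing the reverse Kullback-Leibler divergence, E_q[log q(x) + E(x)] up to the unknown log partition function, against an unnormalized continuous spin-glass energy.

```python
import torch
import torch.nn as nn

N = 8  # number of continuous "spins" (illustrative size)

# Illustrative continuous spin-glass energy: a random symmetric coupling
# matrix J plus a quartic confining term so the density is normalizable.
# (A stand-in for the paper's exact continuous relaxation.)
torch.manual_seed(0)
J = torch.randn(N, N)
J = 0.5 * (J + J.T)
J.fill_diagonal_(0.0)

def energy(x):
    pairwise = -0.5 * torch.einsum('bi,ij,bj->b', x, J, x)
    confine = 0.25 * (x ** 4).sum(dim=1)  # keeps the distribution proper
    return pairwise + confine

class AffineCoupling(nn.Module):
    """One Real NVP coupling layer: half the coordinates are passed through
    unchanged, the other half get a scale/shift computed from the fixed half."""
    def __init__(self, dim, mask):
        super().__init__()
        self.register_buffer('mask', mask)
        self.net = nn.Sequential(
            nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, 2 * dim))
    def forward(self, z):
        z_fixed = z * self.mask
        s, t = self.net(z_fixed).chunk(2, dim=1)
        s = torch.tanh(s) * (1 - self.mask)  # bounded log-scale on the free half
        t = t * (1 - self.mask)
        x = z_fixed + (1 - self.mask) * (z * torch.exp(s) + t)
        return x, s.sum(dim=1)  # log|det Jacobian| is the sum of log-scales

class RealNVP(nn.Module):
    def __init__(self, dim, n_layers=6):
        super().__init__()
        masks = [((torch.arange(dim) + k) % 2).float() for k in range(n_layers)]
        self.layers = nn.ModuleList(AffineCoupling(dim, m) for m in masks)
    def forward(self, z):
        log_det = torch.zeros(z.shape[0])
        for layer in self.layers:
            z, ld = layer(z)
            log_det = log_det + ld
        return z, log_det

flow = RealNVP(N)
base = torch.distributions.Normal(torch.zeros(N), torch.ones(N))
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)

for step in range(2000):
    z = base.sample((256,))
    x, log_det = flow(z)
    # Reverse KL up to the (unknown) log partition function:
    # log q(x) = log p_z(z) - log|det J_f|, so minimize E_q[log q(x) + E(x)].
    loss = (base.log_prob(z).sum(dim=1) - log_det + energy(x)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Training on the forward Kullback-Leibler divergence would instead fit the flow by maximum likelihood on samples from the target distribution, for example states drawn by Monte Carlo; the reverse direction above only requires evaluating the energy, at the cost of a greater risk of mode collapse.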
Presenters
-
Masoud Mohseni
Google AI, Google Research, Google Quantum AI Laboratory
Authors
-
Masoud Mohseni
Google AI, Google Research, Google Quantum AI Laboratory
-
Gavin Hartnett
Engineering and Applied Sciences, RAND Corporation