Self-Supervised Learning of Generative Spin-Glasses with Normalizing Flows

POSTER

Abstract

We develop generative spin-glass models that can be used to represent multi-scale phenomena in physics and computer science. These systems are parametrized in terms of Normalizing Flows, a class of deep neural-network-based generative models. Normalizing Flows are designed to operate on continuous variables, whereas our focus is on discrete-valued spin systems. Therefore, we first provide a continuous formulation of spin-glasses and convert the discrete Boltzmann distributions into physically equivalent continuous distributions. We then machine-learn generative models of these systems in a self-supervised fashion, wherein the training data is derived from the system itself. Within this self-supervised framework, we explore two alternative methods for training the normalizing flow. The first approach minimizes the standard, or forward, Kullback-Leibler (KL) divergence and is able to capture key physical characteristics of the spin-glass phase, including a non-trivial overlap order parameter and ultrametricity. In contrast, an alternative approach based on minimizing the reverse KL divergence, in which the order of the arguments is swapped relative to the forward case, can suffer from mode collapse and fail to capture these properties. We also consider hybrid approaches designed to overcome the mode collapse problem while still retaining the efficient data generation of the reverse KL minimization approach.
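The forward/reverse KL distinction above can be made concrete with a minimal, hypothetical sketch (not the papers' actual architecture or code): a one-dimensional affine "flow" on a Gaussian base, fit to a toy bimodal target whose two modes stand in for the two states of a single spin. Minimizing the forward KL (fitting to samples from the target) covers both modes, while minimizing the reverse KL (fitting using samples from the model and the target log-density) can collapse onto a single mode. All distributions and hyperparameters here are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bimodal target: an illustrative stand-in for a continuous relaxation of
# a single "spin", with modes at +/-2 playing the role of the two spin states.
def log_p(x):
    c = 0.3  # width of each mode
    a = -(x + 2.0) ** 2 / (2 * c**2)
    b = -(x - 2.0) ** 2 / (2 * c**2)
    return np.logaddexp(a, b) + np.log(0.5) - 0.5 * np.log(2 * np.pi * c**2)

# Model: affine flow x = mu + s * z with a standard-normal base, i.e. q = N(mu, s^2).
def log_q(x, mu, log_s):
    s = np.exp(log_s)
    return -0.5 * ((x - mu) / s) ** 2 - log_s - 0.5 * np.log(2 * np.pi)

# Forward KL(p||q): trained on samples from p (the "self-supervised" data).
# For a Gaussian model family this reduces to moment matching.
x_p = np.concatenate([rng.normal(-2, 0.3, 5000), rng.normal(2, 0.3, 5000)])
mu_fwd, s_fwd = x_p.mean(), x_p.std()

# Reverse KL(q||p) = E_q[log q - log p], estimated with reparameterized
# samples x = mu + s * eps and minimized by (numerical) gradient descent.
eps = rng.normal(size=2000)

def rev_kl(params):
    mu, log_s = params
    x = mu + np.exp(log_s) * eps
    return np.mean(log_q(x, mu, log_s) - log_p(x))

params = np.array([1.0, 0.0])  # init slightly biased toward the +2 mode
for _ in range(3000):
    grad = np.zeros(2)
    for i in range(2):
        d = np.zeros(2)
        d[i] = 1e-4
        grad[i] = (rev_kl(params + d) - rev_kl(params - d)) / 2e-4
    params -= 0.01 * np.clip(grad, -10, 10)  # clip for stability
mu_rev, s_rev = params[0], np.exp(params[1])

print(f"forward KL fit: mu={mu_fwd:.2f}, s={s_fwd:.2f}  (spread over both modes)")
print(f"reverse KL fit: mu={mu_rev:.2f}, s={s_rev:.2f}  (collapsed onto one mode)")
```

The forward-KL fit centers between the modes with a large width, whereas the reverse-KL fit locks onto a single mode with a narrow width, mirroring the mode-collapse behavior described in the abstract.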

Publications:

  • https://arxiv.org/abs/2001.00585

  • https://arxiv.org/abs/2001.00927

Presenters

  • Gavin S Hartnett

    RAND Corporation

Authors

  • Gavin S Hartnett

    RAND Corporation

  • Masoud Mohseni

    Google LLC