How deep neural networks learn thermal phase transitions
ORAL
Abstract
Machine-learning methods have been used successfully to identify phase transitions from data. Neural network (NN)-based approaches are particularly appealing because NNs can learn arbitrary functions. However, the larger an NN, the more computational resources are needed to train it, and the more difficult it is to understand its decision making. We derive analytical expressions for the optimal output of three popular NN-based methods for detecting phase transitions, which, at their core, rely on solving classification and regression tasks via supervised learning [1]. This result corresponds to the output obtained with optimal predictive models, such as sufficiently large NNs after ideal training. Our analysis reveals that, when used to detect thermal phase transitions, high-capacity NNs ultimately gauge changes in the energy distribution as a function of temperature. In contrast, low-capacity NNs appear to learn order parameters, i.e., to recognize prevalent patterns or orderings. Our theoretical findings are supported by analyzing data from numerical simulations of classical spin systems.
Publication: [1] Julian Arnold and Frank Schäfer, Phys. Rev. X 12, 031044 (2022)
Presenters
- Frank Schäfer, CSAIL, Massachusetts Institute of Technology
Authors
- Julian Arnold, Department of Physics, University of Basel
- Frank Schäfer, CSAIL, Massachusetts Institute of Technology