Statistical physics insights on learning in high dimensions
ORAL · Invited
Abstract
The very purpose of physics is to understand empirically observed behaviour. From this point of view, the current success of machine learning provides a myriad of as-yet-mysterious empirical observations that call for explanation, in particular in high-dimensional, non-convex settings. Inspired by physics, where simple models are at the core of our theoretical understanding of the world, we study models of neural networks that are simple yet capture the salient features of real systems. In this talk, I will present several high-dimensional, non-convex statistical learning problems and highlight the importance of the associated theoretical questions. The common point of these settings is that the data come from a probabilistic generative model, leading to problems for which, in the high-dimensional limit, statistical physics provides exact closed-form solutions for the performance of gradient-based algorithms as well as for the optimally achievable performance, taken as a benchmark. I will describe some of our recent progress in the hunt for suitable models to study how the interplay between data and optimisation strategy can result in efficient learning.
Presenters
Francesca Mignacco
Institute of Theoretical Physics, CEA Saclay
Authors
Francesca Mignacco
Institute of Theoretical Physics, CEA Saclay