
Zipf's criticality in learning systems

ORAL

Abstract

Many high-dimensional complex systems, including biological ones such as populations of neurons, exhibit Zipf's law. That is, the $r$-th most frequently observed value is seen with frequency proportional to $1/r$. Although this has been proposed to be a signature of fine-tuning, previous work shows that Zipf's law can also emerge from a generic coupling between an observed system with many degrees of freedom and an unobserved fluctuating variable. In this context, the emergence of Zipf's law can be related to the fact that an observation of the large system tightly constrains the value of the unobserved variable. Recently, Zipf's law has been observed in the distribution of functions that can be produced by a neural network of a given architecture. We show that these results hold true for many learning machines (not necessarily deep networks) in regimes where learning is possible. This relates the observation of Zipf's law to the ability of a system to learn a model from data, and also suggests ways to improve learning algorithms.
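
To make the rank-frequency statement above concrete, the following minimal Python sketch (not part of the abstract; the toy distribution and sample size are assumptions chosen only for illustration) draws samples of a discrete variable, ranks states by empirical frequency, and checks that the log-log rank-frequency slope is close to -1, as Zipf's law predicts.

    # Minimal sketch: empirical rank-frequency check for Zipf's law.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy model: states whose true probabilities already follow ~1/r,
    # used only to illustrate the diagnostic itself.
    n_states = 1000
    p = 1.0 / np.arange(1, n_states + 1)
    p /= p.sum()

    samples = rng.choice(n_states, size=100_000, p=p)

    # Empirical rank-frequency curve: sort observed state frequencies in
    # decreasing order and pair them with ranks 1, 2, 3, ...
    counts = np.bincount(samples, minlength=n_states)
    freq = np.sort(counts)[::-1] / counts.sum()
    rank = np.arange(1, n_states + 1)

    # Zipf's law predicts log(freq) ~ -log(rank) + const, i.e. slope ~ -1
    # in log-log coordinates (fit restricted to well-sampled ranks).
    mask = freq > 0
    slope = np.polyfit(np.log(rank[mask][:200]), np.log(freq[mask][:200]), 1)[0]
    print(f"log-log rank-frequency slope: {slope:.2f}  (Zipf's law: ~ -1)")

In practice one would apply the same rank-frequency diagnostic to observed data (e.g., neural population states or the functions produced by a learning machine) rather than to a toy distribution constructed to be Zipfian.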

Presenters

  • Sean A Ridout

    Emory University

Authors

  • Sean A Ridout

    Emory University

  • Ilya M Nemenman

    Emory University