Renormalized Mutual Information for Artificial Scientific Discovery

ORAL

Abstract

Ranging from statistical physics to hydrodynamics, collective coordinates are among the most useful general concepts in the analysis of physical systems. However, there is no general procedure for finding them; they are usually engineered by hand. These low-dimensional "features" can be defined as variables that preserve the largest mutual information with the original coordinates of the system. Mutual information, however, is ill-defined when one continuous random variable depends deterministically on the other. We develop a new "renormalized" version that retains the same physical meaning but remains finite. This quantity can be used to assess how useful a given macroscopic quantity would be in characterizing a system. In addition, we can parametrize the feature function with a neural network and optimize it to obtain the best feature. This high-level representation is learned in a completely unsupervised way. We show examples involving many-particle systems and fluctuating fields, but the technique has potential applications not only in the most diverse physical scenarios, from statistical physics to dynamical systems or even quantum mechanics, but also as a tool to study neural networks themselves from an information-theoretic perspective.
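As an illustration of the idea, the sketch below estimates a renormalized mutual information for simple candidate features of a toy system. It assumes a scalar feature y = f(x) and uses the finite, reparametrization-invariant combination H(f(x)) − ½ E[ln ∇f·∇f] as a stand-in for the quantity described in the abstract; this specific formula, the toy Gaussian "system", and all function names are illustrative assumptions, not taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def renormalized_mi(samples, f, grad_f, bins=50):
    """Estimate H(f(x)) - 0.5 * E[ln(grad f . grad f)] from samples.

    NOTE: this finite, reparametrization-invariant combination is an
    assumed stand-in for the renormalized mutual information; the exact
    definition is not given in the abstract.
    """
    y = f(samples)
    # plug-in (histogram) estimate of the differential entropy H(y)
    hist, edges = np.histogram(y, bins=bins, density=True)
    width = edges[1] - edges[0]
    p = hist[hist > 0]
    entropy = -np.sum(p * np.log(p)) * width
    # gradient term: keeps the quantity finite and invariant under
    # invertible reparametrizations y -> g(y) of the feature
    grads = grad_f(samples)  # shape (n_samples, dim)
    grad_term = 0.5 * np.mean(np.log(np.sum(grads**2, axis=1)))
    return entropy - grad_term

# toy "system": a 2D anisotropic Gaussian; candidate features are linear, f(x) = w . x
x = rng.normal(size=(100_000, 2)) * np.array([3.0, 0.5])

def make_linear(w):
    w = np.asarray(w, dtype=float)
    return (lambda s: s @ w), (lambda s: np.tile(w, (len(s), 1)))

f1, g1 = make_linear([1.0, 0.0])  # aligned with the high-variance direction
f2, g2 = make_linear([0.0, 1.0])  # aligned with the low-variance direction
f3, g3 = make_linear([5.0, 0.0])  # rescaled copy of f1

rmi1 = renormalized_mi(x, f1, g1)
rmi2 = renormalized_mi(x, f2, g2)
rmi3 = renormalized_mi(x, f3, g3)

print(rmi1, rmi2, rmi3)  # rmi1 > rmi2; rmi3 close to rmi1 (scale invariance)
```

In an actual application one would replace the fixed linear features by a neural network f_θ and maximize this estimate over θ in an unsupervised way; the abstract's examples (many-particle systems, fluctuating fields) go well beyond this toy setting.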

Presenters

  • Leopoldo Sarra

    Max Planck Inst for Sci Light

Authors

  • Leopoldo Sarra

    Max Planck Inst for Sci Light

  • Andrea Aiello

Marquardt Division, Max Planck Institute for the Science of Light

  • Florian Marquardt

Univ Erlangen Nuremberg, Max Planck Institute for the Science of Light