Perturbation Theory for the Information Bottleneck

ORAL

Abstract

Extracting relevant information from data is crucial for all forms of learning. The information bottleneck (IB) method formalizes this, offering a mathematically precise and conceptually appealing framework for understanding learning phenomena. However, the nonlinearity of the IB problem makes it computationally expensive and, in general, analytically intractable. Here we derive a perturbation theory for the IB method and report new analytical results for the onset of learning: the limit of maximum relevant information per bit extracted from data. We test our results on synthetic probability distributions and find good agreement with the exact numerical solution near the onset of learning. Our work also provides a fresh perspective on the intimate relationship between the IB method and the strong data processing inequality.
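The exact numerical solution the abstract compares against is typically obtained by iterating the IB self-consistent equations (a Blahut-Arimoto-style fixed-point scheme). The sketch below illustrates that generic iteration for a small joint distribution; it is not the perturbative method of this work, and the function name, cluster count, and iteration budget are illustrative choices.

```python
import numpy as np

def ib_iterate(p_xy, beta, n_clusters=4, n_iter=300, seed=0):
    """Iterate the IB self-consistent equations for a joint distribution p(x, y).

    Returns an encoder q(t|x) that trades compression I(X;T) against
    relevance I(T;Y) at inverse temperature beta. Generic sketch only,
    not the paper's perturbation theory.
    """
    rng = np.random.default_rng(seed)
    p_x = p_xy.sum(axis=1)                       # marginal p(x)
    p_y_given_x = p_xy / p_x[:, None]            # conditional p(y|x)

    # random initial encoder q(t|x); rows normalized to sum to 1
    q_t_given_x = rng.random((len(p_x), n_clusters))
    q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        # cluster marginal q(t), clipped to avoid division by zero
        q_t = np.clip(q_t_given_x.T @ p_x, 1e-12, None)
        # decoder q(y|t) = sum_x p(y|x) q(t|x) p(x) / q(t)
        q_y_given_t = (q_t_given_x * p_x[:, None]).T @ p_y_given_x / q_t[:, None]
        q_y_given_t = np.clip(q_y_given_t, 1e-12, None)
        # KL divergence D[p(y|x) || q(y|t)] for every (x, t) pair
        log_ratio = np.log(p_y_given_x[:, None, :] / q_y_given_t[None, :, :])
        d_kl = (p_y_given_x[:, None, :] * log_ratio).sum(axis=2)
        # encoder update: q(t|x) proportional to q(t) exp(-beta * D_KL)
        q_t_given_x = q_t[None, :] * np.exp(-beta * d_kl)
        q_t_given_x /= q_t_given_x.sum(axis=1, keepdims=True)
    return q_t_given_x
```

Sweeping `beta` from small to large traces the IB trade-off curve; the onset of learning studied in the abstract corresponds to the small-`beta` end, where the encoder first departs from a trivial (uninformative) solution.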

Presenters

  • Vudtiwat Ngampruetikorn

    The Graduate Center, City University of New York

Authors

  • Vudtiwat Ngampruetikorn

    The Graduate Center, City University of New York

  • David J. Schwab

    The Graduate Center, City University of New York