Information and Optimal Inference

ORAL

Abstract

Physics works because it ignores microscopic details, not because they are small, but because they are unimportant. This idea of importance has a natural interpretation in information theory, and in earlier work we showed that omitting detail isn't a trade-off: a model that includes many irrelevant parameters captures much less information than a simpler model without them, when both are fitted to the same limited, noisy data [1]. We also showed that a Bayesian prior that is unaware that some parameters are irrelevant induces a large bias, which the simpler model avoids [2]. Here we present recent results on inferring the parameters of the optimally simple model directly from data, given knowledge of the experiment and its noise, but without the need to first construct a prior.

[1] Mattingly, Transtrum, Abbott & Machta, “Rational Ignorance” PNAS 115 (2018) 1760

[2] Abbott & Machta, “Far from Asymptopia” (2022) arXiv:2205.03343

Presenters

  • Michael C Abbott

    Yale University

Authors

  • Michael C Abbott

    Yale University

  • Julian A Rubinfien

    Yale University

  • Benjamin B Machta

    Physics, Qbio Institute, Yale University