
Quitting when you're ahead, but how do you know when you're ahead?

ORAL · Invited

Abstract

Ever since filing our doctoral theses, all of us academics have known the feeling of wanting to make a piece of work perfect, and then realizing there is not enough time. As we progress in our careers, we may eventually opine that perfect is always the enemy of the good. But this lofty principle does not help us figure out how, why, or when our level of certainty in our academic output is good enough to declare, with quantified uncertainty, what we think our results are.



Some framings of this vague question can be answered in squarely principled ways. In this presentation, we will explore ways in which simple principles of Bayesian statistics and applied probability modeling can help us decide when statistical modeling for uncertainty quantification is insufficient, when it feels like enough, and when it has gone too far. We will mention an application to quantifying uncertainty for the nuclear saturation point, which illustrates one methodological principle: elementary model mixing may be superior to finer-tuned multi-model assimilation. We will then delve briefly into the broader principles of honesty, efficiency, and predictive power in Bayesian statistics, illustrated with applications to agro-ecology which, we hope, can find a translation into nuclear physics. And if you have never heard of agro-ecology, this is a chance to find out a bit about it.
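To give a concrete sense of what "elementary model mixing" can mean, here is a minimal sketch (not the speaker's method) of two-model Bayesian mixing: two models each give a Gaussian predictive distribution for one observable, weights come from the likelihood of a single calibration datum under each model, and the mixed prediction is the resulting Gaussian mixture. All model names, numbers, and uncertainties below are hypothetical, for illustration only.

```python
import numpy as np

# Hypothetical predictive distributions (mean, std) for a single
# observable -- e.g. a saturation-point quantity -- from two models.
# All numbers here are made up for illustration.
model_a = (0.16, 0.01)   # model A: predictive mean, std
model_b = (0.15, 0.02)   # model B: predictive mean, std

# Hypothetical calibration datum with its own uncertainty,
# used to score each model via a Gaussian likelihood.
y_obs, sigma_obs = 0.155, 0.005

def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def weight(model):
    # Likelihood of the datum under the model's predictive distribution,
    # with predictive std and data uncertainty added in quadrature;
    # equal prior model odds are assumed.
    mu, sd = model
    return gaussian_pdf(y_obs, mu, np.hypot(sd, sigma_obs))

w = np.array([weight(model_a), weight(model_b)])
w /= w.sum()

# Mixture mean and variance (law of total variance for a two-component
# Gaussian mixture): the mixed uncertainty includes between-model spread.
means = np.array([model_a[0], model_b[0]])
sds = np.array([model_a[1], model_b[1]])
mix_mean = w @ means
mix_var = w @ (sds**2 + means**2) - mix_mean**2

print(f"weights: {w.round(3)}")
print(f"mixed prediction: {mix_mean:.4f} +/- {np.sqrt(mix_var):.4f}")
```

Note the design point this toy example makes: even with only two models and one datum, the mixture variance carries a between-model term, so the mixed uncertainty can honestly exceed either model's own error bar, which is part of why simple mixing can be preferable to over-tuned assimilation.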

Presenters

  • Frederi Viens

    William Marsh Rice University

Authors

  • Frederi Viens

    William Marsh Rice University