
Accelerating Model Selection with Relative Loss Topologies

POSTER

Abstract

Current model selection methods are often inefficient in high-dimensional or nonlinear systems, where evaluating every candidate model is computationally prohibitive. Traditional approaches frequently repeat redundant calculations and struggle to navigate complex loss landscapes, especially in systems with intricate minima or highly variable functional behavior. This work introduces a framework that reduces the computational load by exploiting precomputed loss relationships between a small set of basis functions and a larger set of candidate models, enabling efficient model selection without evaluating each candidate directly. By analyzing the relative loss topologies of the basis set, the framework makes more effective use of known system data. The initial investment in precomputing these basis-candidate relationships is offset by substantial information gains during iterative optimization, making the approach particularly valuable for machine learning and inverse problems, where efficient exploration of complex function spaces is essential to reducing redundant calculations and accelerating convergence.
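
The abstract does not include an implementation, so the following is a minimal sketch of the core idea under assumptions not stated in the abstract: that candidate models are linear combinations of the basis functions and that the loss is squared error on observed data. Under those assumptions, every candidate's loss follows from a one-time Gram matrix and basis-target inner products, so no candidate is ever evaluated on the full dataset. All names and values below are hypothetical.

```python
import numpy as np

# Hypothetical sketch: precompute loss relationships among a small basis,
# then score many candidate models (assumed here to be linear combinations
# of the basis) without re-evaluating any candidate on the full dataset.

rng = np.random.default_rng(0)

# Synthetic data: n samples of a target signal y on a grid x.
n = 2000
x = np.linspace(0.0, 1.0, n)
y = np.sin(6 * x) + 0.1 * rng.standard_normal(n)

# Small basis set (k functions), evaluated once on the data grid.
basis = np.stack([x**0, x, x**2, np.sin(6 * x), np.cos(6 * x)])  # (k, n)
k = basis.shape[0]

# One-time precomputation: basis-basis and basis-target relationships.
# Cost: O(k^2 n), paid once.
G = basis @ basis.T   # (k, k) Gram matrix of the basis
b = basis @ y         # (k,)   basis-target inner products
yy = y @ y            # scalar ||y||^2

# Many candidate models, each a coefficient vector over the basis.
m = 100_000
C = rng.standard_normal((m, k))  # (m, k) candidate coefficients

# Squared-error loss of candidate c is ||y - c @ basis||^2
# = yy - 2 c.b + c.G.c, computable entirely from the precomputed
# quantities in O(k^2) per candidate, independent of n.
losses = yy - 2 * C @ b + np.einsum("mi,ij,mj->m", C, G, C)

best = int(np.argmin(losses))
print(f"best candidate: {best}, loss: {losses[best]:.4f}")
```

In this toy setting the precomputation costs O(k^2 n), after which each of the m candidates is scored in O(k^2) regardless of the dataset size, which is one concrete way the up-front investment in basis relationships can pay off during iterative search.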

Presenters

  • Jack Dienhart

    (Niels Bohr Institute; Lawrence Berkeley National Laboratory)

Authors

  • Jack Dienhart

    (Niels Bohr Institute; Lawrence Berkeley National Laboratory)