Scalable Solutions for Training Machine Learned Interatomic Potentials
ORAL
Abstract
The promise of all machine learning (ML) methods is that model accuracy can in principle be improved indefinitely, so long as new training data are provided. This keeps model predictions as interpolations within the trained space rather than relying on uncertain extrapolations. For machine-learned interatomic potentials used in molecular dynamics, there is no way to know a priori all the states of the material that will be observed in a large-scale production simulation. Automated training-data curation, whether performed in real time or through diversity-maximizing selection techniques, is sought to alleviate these concerns, though the assembled training sets now scale with the size of the computing resources used. This talk will overview the user-friendly FitSNAP code and its integration into the Exascale Computing Project EXAALT software stack, with a focus on the challenges and advances made in tackling the exascale-sized training sets needed to construct robust and truly transferable interatomic potentials.
Presenters
- Mitchell A Wood, Sandia National Laboratories
Authors
- Mitchell A Wood, Sandia National Laboratories
- Charles A Sievers, University of California, Davis
- Danny Perez, Los Alamos National Laboratory
- Nick Lubbers, Los Alamos National Laboratory
- Aidan P Thompson, Sandia National Laboratories