Learning-based predictive models: a new approach to integrating large-scale simulations and experiments
ORAL
Abstract
We will describe a large research effort at Lawrence Livermore National Laboratory aimed at using recent advances in deep learning, computational workflows, and computer architectures to develop an improved predictive model: the learned predictive model. Our goal is to first train these new models, typically cyclic generative adversarial networks, on simulation data to capture the theory implemented in advanced simulation codes. Later, we improve, or elevate, the trained models by incorporating experimental data. We will present work using inertial confinement fusion (ICF) as a testbed for development. We will describe advances in machine learning architectures and methods necessary to handle the challenges of ICF science, including rich, multimodal data and strong nonlinearities. We will also cover state-of-the-art tools that we developed to manage our computational workflow. These tools handle a wide range of tasks, including building enormous simulated training data sets, driving the training of learned models on simulation data, and elevating learned models through exposure to experiment. We will end by drawing connections to other scientific applications, both within LLNL and in the broader computational science community.
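The two-stage workflow described above, pretraining on plentiful simulation data and then "elevating" the model with sparse experimental data, can be sketched as a simple transfer-learning toy problem. Everything below is an illustrative assumption for exposition (a linear surrogate standing in for the learned model, synthetic data standing in for simulations and experiments); it is not LLNL's actual implementation, which uses deep generative networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: "simulation" data -- assume a cheap analytic stand-in for the
# physics theory encoded in the simulation code.
X_sim = rng.uniform(-1, 1, size=(500, 2))
y_sim = 3.0 * X_sim[:, 0] - 2.0 * X_sim[:, 1]

# Pretrain: a least-squares fit captures the simulated "theory".
A_sim = np.c_[X_sim, np.ones(len(X_sim))]
w = np.linalg.lstsq(A_sim, y_sim, rcond=None)[0]

# Stage 2: a handful of "experiments" carrying a systematic offset
# that the simulations miss (the quantity elevation must recover).
X_exp = rng.uniform(-1, 1, size=(5, 2))
y_exp = 3.0 * X_exp[:, 0] - 2.0 * X_exp[:, 1] + 0.5

# Elevate: keep the pretrained weights fixed and correct only the bias
# term from the sparse experimental residuals.
residual = y_exp - np.c_[X_exp, np.ones(len(X_exp))] @ w
w_elevated = w.copy()
w_elevated[-1] += residual.mean()  # learned bias approaches the true 0.5 offset
```

The design choice mirrored here is that elevation adjusts only a small part of the pretrained model, so a few expensive experiments can correct a model trained on millions of cheap simulations without erasing the simulated physics.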
Presenters
-
Brian K. Spears
Lawrence Livermore Natl Lab
Authors
-
Brian K. Spears
Lawrence Livermore Natl Lab
-
Jayson Dean Lucius Peterson
Lawrence Livermore Natl Lab
-
Timo Bremer
Lawrence Livermore Natl Lab
-
Brian Van Essen
Lawrence Livermore Natl Lab
-
John E Field
Lawrence Livermore Natl Lab
-
Peter Robinson
Lawrence Livermore Natl Lab
-
Jessica Semler
Lawrence Livermore Natl Lab
-
Bogdan Kustowski
Lawrence Livermore Natl Lab
-
Jim A Gaffney
Lawrence Livermore Natl Lab