Optimizing Computationally-Intensive Simulations Using a Biologically-Inspired Acquisition Function and a Fourier Neural Operator Surrogate
POSTER
Abstract
* This work was financially and computationally supported by Lawrence Livermore National Laboratory (LLNL) and the U.S. Department of Energy (DOE) under contract DE-AC52-07NA27344. We hope our study can help fuel future advancements at the lab and elsewhere for the betterment of the worldwide scientific community. IM: LLNL-CONF-854201.
Presenters
- John P Lins, Lawrence Livermore National Laboratory

Authors
- John P Lins, Lawrence Livermore National Laboratory
- Wei Liu, Lawrence Livermore National Laboratory