Epistemic Uncertainty Quantification of Deep Neural-Network Based Turbulence Closures
ORAL
Abstract
With the increase in data surrounding diverse turbulence phenomena, Deep Neural Network (DNN) based turbulence closures are being developed to increase the fidelity of turbulence models. A common approach is to increase the fidelity of Reynolds-Averaged Navier-Stokes (RANS) simulations using DNNs trained on data from Direct Numerical Simulation (DNS) or high-resolution Large Eddy Simulation (LES). However, the main obstacle to wider adoption of DNN-based turbulence closures in simulations of critical industrial processes is their inability to quantify the uncertainty of the DNN's predictions. This uncertainty can be quantified with Bayesian statistics: the epistemic (model) uncertainty of a DNN is determined through Bayesian inference. This study introduces the background of Bayesian inference and how it is used to quantify the epistemic uncertainty of DNN-based turbulence closures. Different Bayesian inference approximation methods, such as Deep Ensembles, Monte-Carlo Dropout, and Stochastic Variational Inference, will be compared, along with their associated uncertainty quantification performance.
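As a minimal illustration of one of the approximation methods named above, the sketch below shows the core idea of Monte-Carlo Dropout: dropout is kept active at inference time, and the spread across stochastic forward passes serves as an estimate of the model's epistemic uncertainty. The toy network, weights, and input here are illustrative assumptions, not part of the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy single-hidden-layer regression network with fixed random weights
# (in practice these would be trained, e.g., on DNS/LES closure data).
W1 = rng.normal(size=(1, 32))
b1 = np.zeros(32)
W2 = rng.normal(size=(32, 1))
b2 = np.zeros(1)

def forward(x, p_drop=0.2):
    """One stochastic forward pass with dropout kept ON at inference."""
    h = np.tanh(x @ W1 + b1)
    mask = rng.random(h.shape) > p_drop   # Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)         # inverted-dropout scaling
    return h @ W2 + b2

# Repeat the stochastic forward pass and summarize the samples.
x = np.array([[0.5]])
samples = np.stack([forward(x) for _ in range(200)])

mean = samples.mean(axis=0)  # predictive mean
std = samples.std(axis=0)    # spread across passes: epistemic-uncertainty proxy
print(f"prediction = {mean.ravel()[0]:.3f} +/- {std.ravel()[0]:.3f}")
```

Deep Ensembles follow the same recipe with the stochasticity coming from independently trained networks rather than dropout masks, while Stochastic Variational Inference instead samples the weights themselves from a learned posterior approximation.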
Presenters
-
Cody Grogan
Utah State University
Authors
-
Cody Grogan
Utah State University
-
Som Dutta
Utah State University
-
Mauricio Tano
Idaho National Laboratory
-
Som Dhulipala
Idaho National Laboratory
-
Izabela Gutowska
Oregon State University