Error control in neural network-coupled binary trees for complex manifold-based combustion models
ORAL
Abstract
Manifold-based combustion models reduce the cost of turbulent combustion simulations by projecting the thermochemical state onto a lower-dimensional manifold, allowing the thermochemical state to be computed separately from the flow solver. Solutions to the manifold equations have traditionally been precomputed and pretabulated, resulting in large memory requirements and significant precomputation cost, even for simple problems. In-Situ Adaptive Manifolds (ISAM) enables solutions to the manifold equations to be computed as the simulation progresses and stored in binary trees using In-Situ Adaptive Tabulation (ISAT), allowing for the use of more general manifold models. While ISAT reduces the memory requirements compared to pretabulation approaches, as the model complexity grows, the memory requirements of ISAT databases will still eventually become prohibitively large. Previous work has shown that ISAT memory requirements can be reduced in Large Eddy Simulation (LES) of canonical turbulent flames by pruning portions of the binary trees and replacing them with neural networks. After pruning, however, the accuracy of the neural network predictions can no longer be guaranteed, whereas accuracy is inherently controlled in normal ISAT operation. This work evaluates two methods for introducing error control into this neural network-coupled binary tree (NNBT) approach. The first method includes Monte Carlo dropout layers in the neural network structure to provide estimates of prediction uncertainty after training; the second method fits a Gaussian Process Regressor to the ISAT database to identify regions that have sufficient data and regions that require more data prior to training. Both methods will be evaluated in terms of their computational efficiency and their ability to accurately estimate or reduce the error in the manifold solutions.
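As a minimal sketch (not from the abstract) of how the first error-control method could operate, the snippet below keeps dropout active at inference time and uses the spread across repeated stochastic forward passes as a per-query uncertainty estimate. The network sizes, dropout rate, number of passes, and tolerance are illustrative assumptions, not the authors' settings.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical surrogate for a pruned ISAT subtree: inputs are manifold
# coordinates, outputs are a tabulated thermochemical quantity.
model = nn.Sequential(
    nn.Linear(2, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p=0.1),
    nn.Linear(64, 1),
)

x = torch.rand(128, 2)  # hypothetical query points on the manifold

# Keep dropout active at inference so each forward pass samples a
# different subnetwork (Monte Carlo dropout).
model.train()
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(50)])  # (50, 128, 1)

mean = samples.mean(dim=0)  # point prediction
std = samples.std(dim=0)    # per-query uncertainty estimate

# Queries whose estimated uncertainty exceeds a tolerance could fall back
# to a direct manifold-equation evaluation rather than the network output.
flagged = (std > 0.05).squeeze()
```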
Presenters
- Stephen Trevor Fush, Princeton University

Authors
- Stephen Trevor Fush, Princeton University
- Israel J Bonilla, Princeton University
- Michael E Mueller, Princeton University