Harnessing the power of natural language processing techniques for multiscale turbulence simulation
ORAL
Abstract
In this work, we leveraged neural machine translation (NMT) methods to enable accelerated multiscale simulations of Burgers turbulence. We employed a sequence-to-sequence (seq2seq) autoencoding architecture built on long short-term memory (LSTM) networks. Originally developed for natural language processing (NLP), this mechanism enabled the implementation of a coarse projective integration (CPI) multiscale scheme by translating between the energy spectrum and velocity field descriptions of the flow. The seq2seq model creates a many-to-many mapping between these scales. When integrated into the CPI scheme, this mapping acts as an effective closure model for the lifting operator, translating coarse-scale information back to the fine scale with a mean squared error (MSE) of 0.005, half that of a random-phase-initialized velocity signal (MSE of 0.01). The Burgers equation was evolved to statistical stationarity using this model, yielding a computational savings factor of 442 relative to direct numerical simulation (DNS) while retaining three-digit precision. Additionally, a convolutional neural network (CNN) was trained to perform the same translation and outperformed the LSTM for certain CPI parameter choices. This work demonstrates the potential of neural networks to accelerate multiscale turbulence simulations.
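For readers unfamiliar with the seq2seq setup, the following is a minimal illustrative sketch of the spectrum-to-velocity translation the abstract describes, assuming a PyTorch LSTM encoder-decoder. The layer sizes, the autoregressive decoding scheme, and the toy data are our assumptions for illustration, not the architecture from the paper.

```python
# Minimal sketch (not the authors' released code) of the spectrum-to-velocity
# "lifting" translation, written as a PyTorch seq2seq encoder-decoder with
# LSTM layers. All sizes (spectral modes, grid points, hidden width) and the
# autoregressive decoding are illustrative assumptions.
import torch
import torch.nn as nn

class Seq2SeqLift(nn.Module):
    def __init__(self, hidden_dim=64):
        super().__init__()
        # Encoder reads the coarse-scale energy spectrum, one mode per step.
        self.encoder = nn.LSTM(input_size=1, hidden_size=hidden_dim, batch_first=True)
        # Decoder emits the fine-scale velocity field, one grid point per step.
        self.decoder = nn.LSTM(input_size=1, hidden_size=hidden_dim, batch_first=True)
        self.readout = nn.Linear(hidden_dim, 1)

    def forward(self, spectrum, n_grid):
        # spectrum: (batch, n_modes, 1); the LSTM state (h, c) is the context.
        _, (h, c) = self.encoder(spectrum)
        step = spectrum.new_zeros((spectrum.size(0), 1, 1))
        outputs = []
        for _ in range(n_grid):  # decode the velocity signal point by point
            out, (h, c) = self.decoder(step, (h, c))
            step = self.readout(out)
            outputs.append(step)
        return torch.cat(outputs, dim=1)  # (batch, n_grid, 1) velocity field

# Toy usage: lift a batch of 32-mode spectra to 128-point velocity signals
# and score them with the MSE objective mentioned in the abstract.
model = Seq2SeqLift()
spectra = torch.randn(8, 32, 1)
velocity = model(spectra, n_grid=128)
loss = nn.MSELoss()(velocity, torch.randn_like(velocity))
```

In a CPI loop, a model of this kind would serve as the lifting operator: the coarse variable (the energy spectrum) is projected forward over a large time step, then translated back to a fine-scale velocity field to restart the direct simulation.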
–
Publication: https://arxiv.org/abs/2305.16564
Presenters
-
Mrigank Dhingra
Virginia Tech
Authors
-
Mrigank Dhingra
Virginia Tech
-
Omer San
Oklahoma State University, Stillwater
-
Anne E Staples
Virginia Tech