Data-driven subgrid-scale parameterization of turbulence in the small-data limit
ORAL
Abstract
In this work, we develop a data-driven subgrid-scale (SGS) model for large eddy simulation (LES) of turbulence using a fully convolutional neural network (CNN). We first conduct direct numerical simulation (DNS) and obtain training, validation, and testing data sets by applying a Gaussian spatial filter to the DNS solution. We train the CNN with the filtered state variables, i.e., vorticity and stream function, as inputs and the nonlinear SGS term as the output. A priori analysis shows that the CNN-predicted SGS term accurately captures the inter-scale energy transfer. A posteriori analysis indicates that the LES-CNN outperforms the physics-based models in both short-term prediction and long-term statistics. In the small-data limit, however, the LES-CNN generates artificial instabilities and thus leads to unphysical results. We propose three remedies that allow the CNN to work in the small-data limit: data augmentation and a group convolutional neural network (GCNN), both of which leverage the rotational equivariance of the SGS term, and incorporating a physical constraint on the SGS enstrophy transfer. The SGS term is both translationally and rotationally equivariant in a square periodic flow field. While a standard CNN can capture the translational equivariance, the rotational equivariance can be accounted for either by including rotated snapshots in the training data set or by a GCNN that enforces rotational equivariance as a hard constraint. Additionally, the SGS enstrophy transfer constraint can be implemented in the loss function of the CNN. A priori and a posteriori analyses show that the CNN/GCNN with knowledge/constraints of rotational equivariance and SGS enstrophy transfer enhances the SGS model and allows the data-driven model to work stably and accurately in the small-data limit.
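For concreteness, the sketch below (not from the cited papers; an illustrative PyTorch assumption) shows the three ingredients described above: a fully convolutional network mapping the filtered vorticity and stream function to the SGS term, 90-degree-rotation data augmentation as one way to supply the rotational equivariance, and a loss with a soft penalty on the domain-averaged SGS enstrophy transfer. Layer widths, kernel sizes, and the penalty weight lambda_phys are placeholders, not values from the papers.

    # Minimal sketch, assuming a PyTorch setup; shapes and hyperparameters are illustrative.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SGSCNN(nn.Module):
        """Fully convolutional net: (filtered vorticity, stream function) -> SGS term."""
        def __init__(self, channels=64, layers=5):
            super().__init__()
            blocks = [nn.Conv2d(2, channels, 5, padding=2, padding_mode="circular"), nn.ReLU()]
            for _ in range(layers - 2):
                blocks += [nn.Conv2d(channels, channels, 5, padding=2, padding_mode="circular"), nn.ReLU()]
            blocks += [nn.Conv2d(channels, 1, 5, padding=2, padding_mode="circular")]
            self.net = nn.Sequential(*blocks)

        def forward(self, x):
            return self.net(x)

    def augment_rotations(inputs, targets):
        """Append 90/180/270-degree rotated copies of each snapshot
        (data-augmentation route to rotational equivariance)."""
        xs, ys = [inputs], [targets]
        for k in (1, 2, 3):
            xs.append(torch.rot90(inputs, k, dims=(-2, -1)))
            ys.append(torch.rot90(targets, k, dims=(-2, -1)))
        return torch.cat(xs), torch.cat(ys)

    def physics_constrained_loss(pred, target, omega_bar, lambda_phys=0.1):
        """MSE plus a penalty matching the domain-averaged SGS enstrophy transfer
        (here taken as <omega_bar * SGS term>, an assumed form) of prediction and truth."""
        mse = F.mse_loss(pred, target)
        transfer_pred = (omega_bar * pred).mean(dim=(-2, -1))
        transfer_true = (omega_bar * target).mean(dim=(-2, -1))
        return mse + lambda_phys * F.mse_loss(transfer_pred, transfer_true)

    # Illustrative training step on random tensors standing in for filtered DNS snapshots.
    model = SGSCNN()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    omega_psi = torch.randn(8, 2, 64, 64)   # channels: filtered vorticity, stream function
    pi_sgs = torch.randn(8, 1, 64, 64)      # target nonlinear SGS term
    x, y = augment_rotations(omega_psi, pi_sgs)
    opt.zero_grad()
    pred = model(x)
    loss = physics_constrained_loss(pred, y, x[:, :1])  # x[:, :1] = filtered vorticity channel
    loss.backward()
    opt.step()

The GCNN alternative would replace the plain convolutions with group-equivariant convolutions so that rotational equivariance holds exactly rather than being learned from augmented data.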
Publication:
[1] Subel, Adam, Ashesh Chattopadhyay, Yifei Guan, and Pedram Hassanzadeh. "Data-driven subgrid-scale modeling of forced Burgers turbulence using deep learning with generalization to higher Reynolds numbers via transfer learning." Physics of Fluids 33, no. 3 (2021): 031702.
[2] Guan, Yifei, Ashesh Chattopadhyay, Adam Subel, and Pedram Hassanzadeh. "Stable a posteriori LES of 2D turbulence using convolutional neural networks: Backscattering analysis and generalization to higher Re via transfer learning." arXiv preprint arXiv:2102.11400 (2021).
[3] Guan, Yifei, Adam Subel, Ashesh Chattopadhyay, and Pedram Hassanzadeh. "Developing data-driven subgrid-scale models for stable LES in the small-data limit: Applications of physics-constrained convolutional neural networks and data augmentation" (in preparation).