Capturing small-scale dynamics of turbulence using deep learning
ORAL
Abstract
Turbulent flows are characterized by a wide range of interacting scales. While the large scales are flow-dependent, small-scale features, such as the statistics derived from the velocity gradient tensor, are known to display various universal properties. Hence, understanding and predicting the velocity gradient dynamics is of paramount importance for both theoretical progress and successful modeling. For modeling purposes, the pressure Hessian and the viscous Laplacian appear as unclosed terms and have been the subject of various analytical closure approaches. In this work, we instead model these terms within a deep learning framework, utilizing a tensor-based neural network (TBNN), which satisfies various symmetries and physical constraints by design. The TBNN is trained on a massive database generated from direct numerical simulations (DNS) of isotropic turbulence in periodic domains, with up to 12288³ grid points and Taylor-scale Reynolds numbers Rλ ranging from 140 to 1300. The resulting model shows good agreement with the DNS data for statistics that are not very sensitive to Rλ. By analyzing the statistics of the pressure and viscous terms, we discuss strategies to explicitly incorporate the Rλ-dependence in the model, so as to capture the effects of intermittency.
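As a rough illustration of the TBNN construction referenced in the abstract (not the authors' implementation), the sketch below assumes the tensor-basis formulation in the style of Ling et al. (2016): the modeled tensor is expanded in an integrity basis built from the strain-rate and rotation-rate tensors, with expansion coefficients produced by a small network acting on scalar invariants. The class name, layer sizes, and the truncated basis and invariant sets are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class TensorBasisNet(nn.Module):
    """Minimal TBNN-style model: coefficients from invariants, output as a
    linear combination of precomputed basis tensors."""
    def __init__(self, n_invariants=2, n_basis=4, hidden=64):
        super().__init__()
        # MLP maps Galilean-invariant scalars to one coefficient per basis tensor
        self.mlp = nn.Sequential(
            nn.Linear(n_invariants, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_basis),
        )

    def forward(self, invariants, basis):
        # invariants: (batch, n_invariants); basis: (batch, n_basis, 3, 3)
        g = self.mlp(invariants)                       # (batch, n_basis)
        return torch.einsum('bn,bnij->bij', g, basis)  # modeled 3x3 tensor

def tensor_basis(S, W):
    # First few tensors of the S, W integrity basis (illustrative subset)
    T1 = S
    T2 = S @ W - W @ S
    T3 = S @ S
    T4 = W @ W
    return torch.stack([T1, T2, T3, T4], dim=1)  # (batch, 4, 3, 3)

def invariants(S, W):
    # Two of the scalar invariants (illustrative): tr(S^2), tr(W^2)
    i1 = torch.einsum('bij,bji->b', S, S)
    i2 = torch.einsum('bij,bji->b', W, W)
    return torch.stack([i1, i2], dim=1)  # (batch, 2)

# Example usage on random velocity gradient samples
A = torch.randn(8, 3, 3)                 # velocity gradient tensors
S = 0.5 * (A + A.transpose(1, 2))        # strain-rate (symmetric part)
W = 0.5 * (A - A.transpose(1, 2))        # rotation-rate (antisymmetric part)
model = TensorBasisNet(n_invariants=2, n_basis=4)
out = model(invariants(S, W), tensor_basis(S, W))  # (8, 3, 3) modeled tensor
```

Because the output is built only from tensors and invariants of S and W, predictions transform consistently under rotations and Galilean shifts, which is the "by design" constraint the abstract alludes to.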
Presenters
-
Dhawal Buaria
New York University (NYU)
Authors
-
Dhawal Buaria
New York University (NYU)
-
Katepalli R Sreenivasan
New York University (NYU)