Differentiable physics for generalizable closure modeling of separated flows

ORAL

Abstract

The computational modeling of turbulent flows is challenging due to the high cost of resolving all spatio-temporal scales. Machine learning (ML) methods have been proposed for constructing turbulence closures that alleviate these costs by modeling the effects of unresolved structures on resolved quantities. However, many ML-based turbulence models generalize poorly across geometries and their associated closure requirements. This talk will present results from a differentiable programming framework for learning generalizable closure models. Specifically, our framework trains a graph neural network (GNN) model for subgrid-scale (SGS) stresses embedded within a finite element method (FEM) solver. This is achieved by chaining gradients computed by automatic differentiation through the GNN with the discrete adjoint of the FEM solver, which enables the SGS model to be learned directly from a fully resolved flow field. In this research, we leverage the mesh-invariance property of GNNs to learn subgrid models for separated flows with different separation physics (i.e., smooth and sharp separation) across various geometries. Our formulation yields a single GNN-based subgrid closure model that generalizes across geometries as well as separation phenomena, supporting the conclusion that generalizable ML closures may be constructed using differentiable physics.
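The gradient chaining described above can be illustrated with a minimal sketch. This is not the authors' code: the FEM solver is replaced by a toy linear system A u = b(θ), and the GNN closure by a linear source term b(θ) = Xθ. The key structure is the same, however: a forward solve produces the resolved field u, an adjoint solve with Aᵀ transports the loss sensitivity back to the source term, and the result is chained with the derivative of the closure with respect to its parameters θ.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Toy discrete operator standing in for the assembled FEM system matrix.
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))
# Reference ("fully resolved") field the closure is trained against.
u_target = rng.standard_normal(n)
# Per-node features; the linear map x @ theta stands in for the GNN closure.
x_feat = rng.standard_normal((n, 3))

def loss_and_grad(theta):
    """Forward solve, adjoint solve, and chained gradient w.r.t. theta."""
    b = x_feat @ theta                        # closure model output b(theta)
    u = np.linalg.solve(A, b)                 # forward solve: A u = b
    loss = 0.5 * np.sum((u - u_target) ** 2)  # a-posteriori mismatch loss
    lam = np.linalg.solve(A.T, u - u_target)  # discrete adjoint: A^T lam = dL/du
    grad = x_feat.T @ lam                     # chain with db/dtheta (the "AD" part)
    return loss, grad
```

In the actual framework, `x_feat @ theta` would be a GNN evaluated by automatic differentiation and the two `solve` calls would be the FEM solver and its discrete adjoint; the chain rule that connects them is exactly the `x_feat.T @ lam` contraction above.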

Publication: Shankar, Varun, Romit Maulik, and Venkatasubramanian Viswanathan. "Differentiable Turbulence II." arXiv preprint arXiv:2307.13533 (2023).

Presenters

  • Hojin Kim

    Pennsylvania State University

Authors

  • Hojin Kim

    Pennsylvania State University

  • Romit Maulik

    Pennsylvania State University