
Parallel evolutional deep neural networks for compressible Navier-Stokes

ORAL

Abstract

Evolutional deep neural networks (EDNN) were introduced as predictive solvers for nonlinear partial differential equations (Y. Du & T. A. Zaki, Phys. Rev. E 104, 045303, 2021). Starting from any initial condition projected onto the network state, the EDNN parameters are evolved by solving the governing dynamical equations, without the need for costly training. Several advancements will be discussed. First, multiple EDNNs are utilized simultaneously to predict the evolution of a vector-valued state that is governed by coupled nonlinear partial differential equations. Second, the physical domain is decomposed into sub-regions that are solved by separate EDNNs. Boundary information between networks is exchanged to ensure global accuracy of the solution, using a novel EDNN boundary function (EBF). The resulting partition enables parallel execution and efficient use of multiple smaller networks without sacrificing accuracy. The total number of network parameters necessary to predict the entire state-space evolution is reduced, which lowers the memory footprint and computational expense. The compressible Navier-Stokes equations are solved to demonstrate the accuracy of this multi-EDNN framework.
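The core EDNN mechanism described above can be illustrated with a minimal sketch: the solution is represented by a network u(x; θ), and at each time step the parameter rate-of-change γ is found by projecting the PDE right-hand side onto the network's parameter tangent space via a least-squares solve, then the parameters are advanced in time. The sketch below assumes a single-hidden-layer tanh network and the 1D heat equation u_t = ν u_xx as a stand-in PDE; the network size, collocation points, and Euler time stepping are illustrative choices, not the authors' actual configuration.

```python
import numpy as np

# Hypothetical minimal EDNN sketch for the 1D heat equation u_t = nu * u_xx.
# Ansatz: u(x; theta) = sum_i a_i * tanh(w_i * x + b_i), theta = [a, w, b].
H = 8                                         # hidden-layer width (illustrative)
rng = np.random.default_rng(0)
theta = 0.1 * rng.standard_normal(3 * H)      # network parameters
x = np.linspace(0.0, 1.0, 32)                 # collocation points
nu = 0.1                                      # diffusivity

def unpack(theta):
    return theta[:H], theta[H:2 * H], theta[2 * H:]

def u(theta, x):
    a, w, b = unpack(theta)
    return np.tanh(np.outer(x, w) + b) @ a

def u_xx(theta, x):
    # Second spatial derivative, using tanh'' (s) = -2 tanh(s) (1 - tanh(s)^2).
    a, w, b = unpack(theta)
    t = np.tanh(np.outer(x, w) + b)
    return (-2.0 * t * (1.0 - t**2) * w**2) @ a

def jacobian(theta, x):
    # Columns: du/da_i, du/dw_i, du/db_i at each collocation point.
    a, w, b = unpack(theta)
    t = np.tanh(np.outer(x, w) + b)
    sech2 = 1.0 - t**2
    return np.hstack([t, a * sech2 * x[:, None], a * sech2])

def ednn_step(theta, dt):
    # Solve J gamma ~= N(u) in the least-squares sense, then Euler-step theta.
    J = jacobian(theta, x)
    rhs = nu * u_xx(theta, x)                 # PDE right-hand side N(u)
    gamma, *_ = np.linalg.lstsq(J, rhs, rcond=None)
    return theta + dt * gamma

for _ in range(10):
    theta = ednn_step(theta, 1e-3)
```

In the multi-EDNN setting sketched in the abstract, each sub-region would carry its own copy of such a parameter vector, with boundary data exchanged between neighboring networks at every step.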

Presenters

  • Hadden Kim

    Johns Hopkins University

Authors

  • Hadden Kim

    Johns Hopkins University

  • Tamer A Zaki

    Johns Hopkins University