Attention-enhanced PDE-preserved Neural Network for Predicting Spatiotemporal Physics
ORAL
Abstract
Modeling complex spatiotemporal dynamics plays an essential role in predicting, understanding, and controlling physical processes. However, traditional numerical methods are prohibitively expensive for many-query tasks (e.g., design optimization). Although data-driven models based on deep learning have shown extraordinary capabilities in learning complicated dynamics, issues such as high training costs, error accumulation, and poor generalizability limit their application to real-world problems. A promising approach is to combine the advantages of physics models and deep learning, known as physics-informed deep learning (PiDL). One direction in this regard is to preserve the mathematical structure of the governing physics in the deep learning architecture, i.e., the PDE-preserved neural network (PPNN). In this work, we extend the PPNN structure by leveraging attention mechanisms in both time and space to learn a more accurate representation of the system. A more efficient multi-step time integration scheme can be learned using temporal attention, alleviating error accumulation even further than the original PPNN. The merit of the attention-enhanced PPNN is demonstrated on a group of complex spatiotemporal systems governed by PDEs, including the Navier-Stokes equations.
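The abstract describes the architecture only at a high level. As a rough illustration (not the authors' implementation), the PyTorch sketch below shows one way temporal attention could weight residuals from a PDE-preserved convolutional block over the last K time levels to form a learned multi-step update, loosely analogous to learned Adams-Bashforth coefficients. All names here (`PDEPreservedBlock`, `TemporalAttentionStepper`, `u_hist`) are hypothetical and introduced only for this sketch.

```python
# Hypothetical sketch, not the paper's code: temporal attention over the K most
# recent states, applied to residuals from a placeholder PDE-preserved block.
import torch
import torch.nn as nn


class PDEPreservedBlock(nn.Module):
    """Stand-in for a PPNN block: fixed stencil-like conv plus a trainable correction."""
    def __init__(self, channels: int):
        super().__init__()
        self.stencil = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.correction = nn.Conv2d(channels, channels, kernel_size=3, padding=1)

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # Approximate du/dt for a single snapshot u of shape (batch, channels, H, W).
        return self.stencil(u) + self.correction(u)


class TemporalAttentionStepper(nn.Module):
    """Attend over the last K states to form a learned multi-step time update."""
    def __init__(self, channels: int, embed_dim: int = 64):
        super().__init__()
        self.rhs = PDEPreservedBlock(channels)
        self.encode = nn.Linear(channels, embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)

    def forward(self, states: torch.Tensor, dt: float) -> torch.Tensor:
        # states: (batch, K, channels, H, W), most recent state last.
        b, k, c, h, w = states.shape
        rhs = torch.stack([self.rhs(states[:, i]) for i in range(k)], dim=1)
        # Pool each snapshot to a token so attention can score the K time levels.
        tokens = self.encode(states.mean(dim=(-2, -1)))      # (b, K, embed_dim)
        query = tokens[:, -1:, :]                            # current state as query
        _, weights = self.attn(query, tokens, tokens)        # weights: (b, 1, K)
        # Attention-weighted combination of per-step residuals.
        combo = (weights.reshape(b, k, 1, 1, 1) * rhs).sum(dim=1)
        return states[:, -1] + dt * combo                    # next state


# Usage: 2 samples, a history of 4 past steps, 3 field variables on a 32x32 grid.
u_hist = torch.randn(2, 4, 3, 32, 32)
stepper = TemporalAttentionStepper(channels=3)
u_next = stepper(u_hist, dt=1e-2)                            # (2, 3, 32, 32)
```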
Presenters
- Xin-yang Liu, University of Notre Dame
Authors
- Xin-yang Liu, University of Notre Dame
- Jian-Xun Wang, University of Notre Dame