Interpretable Diffusion Models for Turbulent Flows via Information-Theoretic Interventions
ORAL
Abstract
Over the past century, decomposition techniques have provided powerful low-order representations of turbulent flows, enabling physical insight, computational efficiency, and system prediction. However, these methods struggle to capture the strongly nonlinear interactions present in turbulent and transitional regimes. Recently, machine-learning models have shown great promise in learning compact, nonlinear embeddings directly from raw velocity fields, achieving impressive reconstruction and short-term prediction accuracy. Unfortunately, these models often act as "black boxes," obscuring the mechanisms that underlie their forecasts. This lack of interpretability, which is required for robustness and safety, is a critical barrier for fluid-mechanics applications and has driven a broader shift toward interpretable network architectures across many domains. Beginning with post-hoc information-theoretic analysis, interpretable networks have progressed to improving learning itself through the inclusion of codebooks, interventional layers, and information measures. We propose enhancing Brownian Bridge diffusion (BBD) with interventional branches and entropy-based measures to improve both performance and transparency in modeling complex fluid flows.
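For context on the Brownian Bridge diffusion framework named above, the following is a minimal sketch of one forward noising step between a source field x0 and a target field y, using the standard Brownian-bridge marginal (mean interpolating the endpoints, variance vanishing at both ends). The field shapes, the scale parameter s, and the function name are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def brownian_bridge_forward(x0, y, t, T=1.0, s=1.0, rng=None):
    """Sample x_t from a Brownian-bridge marginal between endpoints x0 and y.

    The mean interpolates linearly between the endpoints, and the variance
    s * m_t * (1 - m_t) vanishes at t = 0 and t = T, so the bridge is pinned
    to x0 at the start and to y at the end. The scale s is an illustrative
    knob, not a value taken from the abstract.
    """
    rng = rng or np.random.default_rng()
    m_t = t / T                              # bridge interpolation coefficient in [0, 1]
    mean = (1.0 - m_t) * x0 + m_t * y        # linear interpolation of the two endpoints
    var = s * m_t * (1.0 - m_t)              # Brownian-bridge variance schedule
    noise = rng.standard_normal(x0.shape)
    return mean + np.sqrt(var) * noise

# Example: noising a toy 2-D "velocity field" snapshot halfway along the bridge.
x0 = np.zeros((64, 64, 2))                   # hypothetical initial field (u, v components)
y = np.ones((64, 64, 2))                     # hypothetical target field
x_half = brownian_bridge_forward(x0, y, t=0.5)
```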
Presenters
- Jackson P R Torok, Florida State University

Authors
- Jackson P R Torok, Florida State University
- Huixuan Wu, Florida State University