Latent Diffusion Models for Partial Differential Equations Modeling
ORAL
Abstract
Recent advances in deep learning have inspired numerous works on data-driven solutions to fluid dynamics problems. These data-driven PDE solvers can often be much faster than their numerical counterparts; however, they present unique limitations and must balance training cost, numerical accuracy, and ease of applicability to different problem setups. To address these limitations, we frame solving time-dependent PDEs as a generative problem and apply latent diffusion models to sample PDE solution trajectories from a learned conditional distribution. In particular, we investigate conditioning on an initial solution as well as conditioning solely on a text prompt, paving the way for more usable and accessible physics solvers. Additionally, we leverage a learned latent space to accelerate training by predicting physics in this reduced space. Through experiments on regular and mesh-based physics problems, we show that this approach is competitive with current data-driven PDE solvers. By extending latent diffusion models to PDE problems, we hope that insights from prior work on generative modeling and diffusion can inspire powerful, fast, and widely applicable physics solvers.
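The pipeline the abstract describes (encode a PDE state into a reduced latent space, run conditional diffusion sampling there, then decode back to physical space) can be sketched in a toy form. Everything below is a hedged illustration under stated assumptions, not the paper's architecture: the linear encoder/decoder, the `toy_denoiser` stand-in for a learned noise predictor, and all shapes and schedule values are hypothetical; only the DDPM-style reverse-process arithmetic follows the standard formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: a linear "autoencoder" maps a 1-D PDE state
# on 64 grid points into an 8-dim latent space (illustrative only).
GRID, LATENT = 64, 8
ENC = rng.standard_normal((LATENT, GRID)) / np.sqrt(GRID)  # encoder weights
DEC = ENC.T                                                # tied decoder

def encode(u):
    return ENC @ u

def decode(z):
    return DEC @ z

# Standard DDPM-style linear beta schedule.
T = 50
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def toy_denoiser(z_t, t, z_cond):
    # Stand-in for a learned epsilon-predictor conditioned on the latent of
    # the initial solution: recovers the noise implied by treating z_cond as
    # the clean latent (purely illustrative, no learning involved).
    return (z_t - np.sqrt(alpha_bars[t]) * z_cond) / np.sqrt(1.0 - alpha_bars[t])

def sample(z_cond):
    z = rng.standard_normal(LATENT)  # start from pure Gaussian noise
    for t in reversed(range(T)):
        eps = toy_denoiser(z, t, z_cond)
        # DDPM reverse-step posterior mean
        z = (z - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        if t > 0:  # no noise on the final step
            z += np.sqrt(betas[t]) * rng.standard_normal(LATENT)
    return decode(z)

u0 = np.sin(np.linspace(0.0, 2.0 * np.pi, GRID))  # toy initial PDE solution
u_next = sample(encode(u0))                        # one sampled latent rollout, decoded
print(u_next.shape)
```

The key design point the abstract highlights survives even in this sketch: the diffusion loop runs entirely in the `LATENT`-dimensional space, so its cost is independent of the physical grid resolution; only `encode`/`decode` touch the full state.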
Presenters
-
Anthony Zhou
Carnegie Mellon University
Authors
-
Anthony Zhou
Carnegie Mellon University
-
Amir Barati Farimani
Carnegie Mellon University