Fourier-Embedded Deep Operator Networks for High-Fidelity Parametric PDE Surrogates and Inverse Optimization
ORAL
Abstract
Operator learning frameworks aim to approximate mappings between infinite-dimensional function spaces, providing data-driven surrogates for solving families of partial differential equations (PDEs). Among them, Deep Operator Networks (DeepONets) are particularly well suited to modeling parametric PDEs across varying inputs and discretizations. In this work, we introduce the Fourier-Embedded DeepONet (FEDONet), a spectrally enhanced operator network designed for high-fidelity surrogate modeling and inverse optimization in PDE-governed systems. Many control and source-recovery tasks rely on accurately capturing fine-scale features, such as sharp gradients, localized sources, and multiscale patterns, that standard neural architectures often under-resolve due to their low-frequency bias. To overcome this, we embed randomized Fourier features into the trunk network, enriching the spectral representation and improving resolution of high-frequency dynamics. FEDONet consistently outperforms standard DeepONets, achieving 2–3× reductions in relative L² error and superior preservation of spectral content, especially in high-gradient and chaotic regimes. We demonstrate its versatility on a suite of parametric PDE benchmarks including the Poisson, Burgers, Allen–Cahn, Lorenz-63, and Kuramoto–Sivashinsky equations. Additionally, we integrate FEDONet into PDE-constrained optimization loops, enabling efficient inverse design and real-time control without repeated numerical solves.
These results position FEDONet as a scalable operator learning framework for digital twins, optimal control, and scientific machine learning.
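A minimal sketch of the core idea described above, namely a DeepONet whose trunk network receives randomized Fourier features of the query coordinates, assuming a PyTorch setup. This is not the authors' implementation; the layer widths, latent dimension `p`, number of Fourier features, and frequency scale `sigma` are illustrative placeholders.

```python
# Sketch of a Fourier-embedded DeepONet (FEDONet-style); hyperparameters are illustrative.
import torch
import torch.nn as nn


class FourierFeatures(nn.Module):
    """Map coordinates y -> [cos(2*pi*B y), sin(2*pi*B y)] using a fixed random matrix B."""
    def __init__(self, in_dim, num_features, sigma=5.0):
        super().__init__()
        # Fixed (non-trainable) Gaussian frequency matrix, as in random Fourier features.
        self.register_buffer("B", torch.randn(in_dim, num_features) * sigma)

    def forward(self, y):
        proj = 2.0 * torch.pi * y @ self.B
        return torch.cat([torch.cos(proj), torch.sin(proj)], dim=-1)


def mlp(sizes):
    layers = []
    for a, b in zip(sizes[:-1], sizes[1:]):
        layers += [nn.Linear(a, b), nn.Tanh()]
    return nn.Sequential(*layers[:-1])  # drop the final activation


class FEDONet(nn.Module):
    """Branch net encodes the sampled input function; the trunk net sees
    Fourier-embedded query coordinates; the two are combined by a dot product."""
    def __init__(self, n_sensors, coord_dim=1, p=64, num_fourier=64, sigma=5.0):
        super().__init__()
        self.branch = mlp([n_sensors, 128, 128, p])
        self.embed = FourierFeatures(coord_dim, num_fourier, sigma)
        self.trunk = mlp([2 * num_fourier, 128, 128, p])
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        b = self.branch(u_sensors)          # (batch, p)
        t = self.trunk(self.embed(y))       # (batch, n_points, p)
        return torch.einsum("bp,bnp->bn", b, t) + self.bias


# Toy usage: 100 sensor values per input function, 256 query points in 1D.
model = FEDONet(n_sensors=100)
u = torch.randn(8, 100)
y = torch.rand(8, 256, 1)
print(model(u, y).shape)  # torch.Size([8, 256])
```

The fixed random frequency matrix broadens the spectrum the trunk can represent, which is the mechanism the abstract invokes to counteract the low-frequency bias of standard architectures.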
Presenters
- Arth Sojitra (University of Tennessee, Knoxville)
Authors
- Arth Sojitra (University of Tennessee, Knoxville)
- Mrigank Dhingra (University of Tennessee, Knoxville)
- Omer San (University of Tennessee)