Attention-based Convolutional Recurrent Autoencoder for Learning Wave Propagation
ORAL
Abstract
While forward analysis using hyperbolic partial differential equations is quite successful in modeling wave propagation phenomena, these phenomena pose challenges for dimensionality reduction and inverse modeling. To forecast the time series of wave propagation, we present a novel attention-based convolutional recurrent autoencoder (AB-CRAN) as a reduced-order model based on a domain-specific deep learning algorithm. The proposed AB-CRAN employs a denoising convolutional autoencoder to project the high-dimensional data onto a low-dimensional nonlinear manifold and an attention-based sequence-to-sequence long short-term memory (LSTM) network to evolve these low-dimensional representations in time. A hybrid loss function combines the autoencoder reconstruction error and the propagator prediction error into a single objective. To learn the optimal network weights, a new supervised-unsupervised training strategy is devised. We demonstrate the effectiveness of our model on three benchmark problems: (i) one-dimensional linear convection with periodic boundary conditions, (ii) the one-dimensional viscous Burgers' equation with Dirichlet boundary conditions, and (iii) two-dimensional wave propagation in shallow water. On all data sets, AB-CRAN accurately captures the wave amplitude and learns the wave propagation in time.
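To make the architecture concrete, the sketch below illustrates the two components named in the abstract, a convolutional autoencoder for latent projection and an attention-based sequence-to-sequence LSTM propagator, together with a hybrid loss that sums reconstruction and latent-prediction terms. This is a minimal illustration assuming PyTorch; the layer sizes, the use of `nn.MultiheadAttention` for the attention mechanism, and the weighting factor `alpha` are assumptions for exposition, not the authors' exact architecture, training strategy, or loss.

```python
import torch
import torch.nn as nn


class ConvAutoencoder1D(nn.Module):
    """Denoising convolutional autoencoder: maps a 1D wave field to a low-dimensional latent vector and back."""
    def __init__(self, n_grid=256, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv1d(1, 16, 5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * (n_grid // 4), latent_dim),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 32 * (n_grid // 4)), nn.ReLU(),
            nn.Unflatten(1, (32, n_grid // 4)),
            nn.ConvTranspose1d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 1, 4, stride=2, padding=1),
        )

    def forward(self, x):
        z = self.encoder(x)          # (batch, latent_dim)
        return self.decoder(z), z    # reconstruction and latent state


class AttnSeq2SeqPropagator(nn.Module):
    """Attention-based sequence-to-sequence LSTM that advances latent states in time."""
    def __init__(self, latent_dim=32, hidden=64):
        super().__init__()
        self.enc_rnn = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.dec_rnn = nn.LSTM(latent_dim, hidden, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.out = nn.Linear(hidden, latent_dim)

    def forward(self, z_hist, n_future):
        enc_out, (h, c) = self.enc_rnn(z_hist)             # encode the latent history
        z_t, preds = z_hist[:, -1:], []
        for _ in range(n_future):                          # autoregressive rollout
            dec_out, (h, c) = self.dec_rnn(z_t, (h, c))
            ctx, _ = self.attn(dec_out, enc_out, enc_out)  # attend over the history
            z_t = self.out(dec_out + ctx)
            preds.append(z_t)
        return torch.cat(preds, dim=1)                     # (batch, n_future, latent_dim)


def hybrid_loss(autoencoder, propagator, x_hist, x_future, alpha=0.5):
    """Hybrid objective: reconstruction term plus latent time-advancement term (alpha is illustrative)."""
    b, t, _, n = x_hist.shape
    x_rec, z_flat = autoencoder(x_hist.reshape(b * t, 1, n))
    loss_rec = nn.functional.mse_loss(x_rec, x_hist.reshape(b * t, 1, n))
    z_hist = z_flat.reshape(b, t, -1)
    tf = x_future.shape[1]
    z_future = autoencoder.encoder(x_future.reshape(b * tf, 1, n)).reshape(b, tf, -1)
    z_pred = propagator(z_hist, n_future=tf)
    loss_prop = nn.functional.mse_loss(z_pred, z_future)
    return alpha * loss_rec + (1 - alpha) * loss_prop
```

In this sketch, the autoencoder is applied snapshot-by-snapshot while the propagator operates only on the latent sequence, which reflects the reduced-order-model structure described in the abstract.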
Presenters
-
Indu Kant Deo
University of British Columbia
Authors
-
Indu Kant Deo
University of British Columbia
-
Rajeev K Jaiman
Mechanical Engineering, University of British Columbia