Learning Chaotic Dynamics through DMD and Neural Networks

ORAL

Abstract

Dynamic mode decomposition (DMD) has become a practical, model-free approach to time-series analysis and modeling, due primarily to its ability to provide modal characterizations of complex flows using only linear spectral techniques, without recourse to constitutive equations. While modern machine learning methods have been combined with DMD to enhance its descriptive, reconstructive, and predictive accuracy, chaotic time series remain challenging for DMD to model accurately, especially in reconstruction and prediction.
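For readers unfamiliar with the linear spectral step referred to above, the following is a minimal sketch of exact DMD via the SVD; the function name, the snapshot-matrix layout, and the optional truncation rank `r` are illustrative choices, not details taken from the abstract or the paper.

```python
import numpy as np

def dmd(X, r=None):
    """Exact DMD of a snapshot matrix X with shape (state dim, time)."""
    X1, X2 = X[:, :-1], X[:, 1:]                 # time-shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    if r is not None:                            # optional rank truncation
        U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(Atilde)           # DMD eigenvalues
    modes = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W  # exact DMD modes
    return eigvals, modes
```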

We present extensions to DMD using autoencoders coupled with Takens embeddings that allow us to accurately learn and then predict dynamics across a range of chaotic dynamical systems, including the classic Lorenz-63 system, the multiscale Rössler system, and the Kuramoto-Sivashinsky equation. The success of our approach rests on allowing the autoencoders to embed the dynamics in higher-dimensional spaces while also making the Takens embeddings adaptive during autoencoder training. Together, these choices yield high reconstruction accuracy over test data while also allowing for nontrivial prediction windows. We likewise explore the impact of our autoencoder networks by studying how they change the mutual information shared across different dimensions of the dynamics. Experiments show that the encoding process significantly alters the information shared between dimensions, helping to better explain the role neural networks play in learning dynamical systems.
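The sketch below illustrates one plausible reading of the pipeline described above: a Takens (Hankel) delay embedding feeding an autoencoder whose latent space is wider than the input, with DMD (the `dmd` function sketched earlier) applied to the encoded trajectory. The window length, network widths, latent dimension, and training details are assumptions for illustration only and are not the architecture or configuration reported in the paper.

```python
import numpy as np
import torch
import torch.nn as nn

def hankel_embed(x, window):
    """Stack `window` delayed copies of a series x of shape (T, d)."""
    T = x.shape[0]
    cols = [x[i:T - window + 1 + i].reshape(T - window + 1, -1)
            for i in range(window)]
    return np.stack(cols, axis=1).reshape(T - window + 1, -1)

class Autoencoder(nn.Module):
    """Simple fully connected autoencoder; latent_dim may exceed in_dim."""
    def __init__(self, in_dim, latent_dim):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                     nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, in_dim))

    def forward(self, h):
        z = self.encoder(h)
        return self.decoder(z), z

# Hypothetical usage: embed, encode, then fit a linear (DMD) model in latent space.
# x = trajectory from, e.g., Lorenz-63, shape (T, 3)
# H = hankel_embed(x, window=16)                   # Takens / Hankel embedding
# model = Autoencoder(H.shape[1], latent_dim=64)   # latent wider than embedding
# ... train model on a reconstruction (and prediction) loss ...
# Z = model.encoder(torch.tensor(H, dtype=torch.float32)).detach().numpy()
# eigvals, modes = dmd(Z.T)                        # linear dynamics in latent space
```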

Publication: CW Curtis, DJ Alford-Lago, E Bollt, A Tuma. "Machine Learning Enhanced Hankel Dynamic-Mode Decomposition." arXiv preprint arXiv:2303.06289

Presenters

  • Daniel J Alford-Lago

    San Diego State University

Authors

  • Christopher W Curtis

    San Diego State University

  • Daniel J Alford-Lago

    San Diego State University

  • Erik Bollt

    Clarkson University

  • Andrew Tuma

    San Diego State University