SGD-SINDy: Stochastic Gradient-Descent based Framework for Flexible System Identification

ORAL

Abstract

We propose a novel methodology within the Sparse Identification of Nonlinear Dynamical Systems (SINDy) framework that leverages stochastic gradient descent (SGD) optimization (SGD-SINDy) to enhance the identification of parameters in dynamical systems and to reduce dependence on a prespecified library of candidate terms. Unlike the traditional SINDy method, which requires the candidate library to be linear in the unknown parameters, our framework identifies the parameters more efficiently and accurately through a flexible global optimization setting. That is, SGD-SINDy does not require prior knowledge of nonlinear parameters such as frequencies in trigonometric functions or bandwidths in exponential functions. Importantly, our approach also alleviates the need for extensive hyperparameter tuning by optimizing the hyperparameters jointly during training. We demonstrate the efficacy of our methodology across various dynamical systems, including coupled ordinary differential equations (ODEs) such as harmonic oscillators, Van der Pol oscillators, the chaotic ABC flow, and reaction kinetics. Our results show substantial improvements in parameter identification when nonlinear features are present, highlighting the potential of SGD optimization to advance SINDy-based analyses.
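The core idea of jointly optimizing the sparse linear coefficients and the nonlinear parameters hidden inside library terms can be sketched with a toy problem. The example below is a hypothetical illustration, not the authors' implementation: the system, the three-term library, the L1 penalty, and plain full-batch gradient descent (standing in for SGD) are all assumptions made for clarity.

```python
import numpy as np

# Hypothetical toy problem (not the paper's code): recover
#   dx/dt = -1.2 x + 0.8 exp(-0.5 t)
# from samples, where the bandwidth 0.5 is a *nonlinear* library
# parameter that classical SINDy would have to fix in advance.
t = np.linspace(0.0, 4.0, 200)
x = np.cos(0.7 * t)                       # arbitrary sampled states
dxdt = -1.2 * x + 0.8 * np.exp(-0.5 * t)  # noise-free derivative samples

c = np.zeros(3)       # linear coefficients for the library [x, exp(-b t), x^2]
b = 1.0               # initial guess for the unknown bandwidth
lr, lam = 0.05, 1e-3  # step size and L1 (sparsity) strength

for _ in range(20000):
    e = np.exp(-b * t)
    theta = np.stack([x, e, x**2], axis=1)       # candidate library Theta
    r = theta @ c - dxdt                         # residual of the dx/dt fit
    grad_c = 2.0 * theta.T @ r / len(t) + lam * np.sign(c)
    grad_b = 2.0 * np.mean(r * c[1] * (-t) * e)  # chain rule through exp(-b t)
    c -= lr * grad_c
    b -= lr * grad_b

print(c, b)  # coefficients approach [-1.2, 0.8, ~0]; bandwidth approaches 0.5
```

Because the bandwidth enters the loss through the chain rule like any other parameter, gradient descent can recover it alongside the coefficients, whereas a purely linear sparse regression would need the correct `exp(-0.5 t)` feature supplied up front; the L1 subgradient meanwhile drives the spurious `x^2` coefficient toward zero.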

Presenters

  • Amirhossein Arzani

    University of Utah

Authors

  • Amirhossein Arzani

    University of Utah

  • Siva Viknesh

    University of Utah

  • Younes Tatari

    University of Utah