Understanding the dynamics of reservoir computing in chaotic dynamical systems

POSTER

Abstract

Data-driven prediction of chaotic and multiscale dynamical systems is a daunting task. There has recently been considerable interest in predicting chaotic and turbulent flows with deep learning methods, with mixed results. However, reservoir computing algorithms such as echo state networks (ESNs) have shown promise in capturing this behavior, outperforming state-of-the-art recurrent neural networks in both short-term direct prediction and the long-term statistics of chaotic systems. ESNs are an appealing choice for data-driven prediction because, unlike deep learning algorithms that train all their weights through an expensive backpropagation algorithm, an ESN trains only a layer of output weights with linear regression, making it orders of magnitude cheaper to train. The memory of the network comes from updating a state vector in the reservoir using the previous state and the input to the system. In this work, we present a theoretical understanding of reservoir dynamics and how the reservoir learns the behavior of chaotic signals; show how the signals split up as the "echoes" of the system; derive an analytical expression and bound for the "dynamical memory" of the reservoir; and seek to understand the mechanism behind the effectiveness of ESNs in capturing chaotic dynamics.
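The training scheme described above can be sketched in a few lines. The following is a minimal illustrative ESN, not the authors' implementation: the reservoir and input weights (here random, rescaled by an assumed spectral radius) stay fixed, the state vector is updated from the previous state and the current input, and only the output layer is fit by ridge regression. All dimensions and hyperparameters are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in = 200, 1          # reservoir and input dimensions (illustrative)
spectral_radius = 0.9         # rescale recurrent weights (echo state property)

# Fixed (untrained) reservoir and input weights.
W = rng.standard_normal((n_res, n_res))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence; collect the state vectors.

    The memory of the network lives in r: each update mixes the previous
    state with the current input through the fixed weights.
    """
    r = np.zeros(n_res)
    states = []
    for u in inputs:
        r = np.tanh(W @ r + W_in @ u)
        states.append(r)
    return np.array(states)

# Toy training signal (one-step-ahead prediction of a smooth scalar series).
t = np.linspace(0, 40 * np.pi, 4000)
signal = np.sin(t) * np.cos(0.31 * t)
U, Y = signal[:-1, None], signal[1:, None]

R = run_reservoir(U)
washout = 100                  # discard the initial transient states
beta = 1e-6                    # ridge regularization strength

# The only training step: a linear least-squares solve for W_out.
A = R[washout:]
W_out = np.linalg.solve(A.T @ A + beta * np.eye(n_res), A.T @ Y[washout:])

pred = A @ W_out
err = np.sqrt(np.mean((pred - Y[washout:]) ** 2))
```

Because the solve above replaces backpropagation through time, training cost is dominated by one `n_res × n_res` linear system, which is the source of the "orders of magnitude cheaper" claim in the abstract.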

Authors

  • Adam Subel

    Rice University

  • Ashesh Chattopadhyay

    Rice University

  • Pedram Hassanzadeh

    Rice University