Forecast saturation and latent dynamics in reservoir computing
POSTER
Abstract
Reservoir computing offers a lightweight and fast framework for forecasting chaotic dynamics, making it attractive for data-driven modeling of complex systems in fluids. In this work, we investigate how the forecast performance of an Echo State Network (ESN) varies with training data length, focusing on understanding data efficiency and uncovering signatures of forecast horizon saturation. Using a fixed reservoir architecture and fixed hyperparameters, we train ESNs on trajectories from chaotic systems ranging from the Lorenz and Rössler attractors to the Kuramoto–Sivashinsky equation, systematically increasing the training data length in discrete steps. We observe that forecast performance improves in a cascade-like manner and can saturate at finite data lengths, even in highly nonlinear systems. Transitions between these performance plateaus consistently coincide with localized bursts in training error, suggesting that such fluctuations, which are readily computed during training, can serve as signals of forecast readiness. Additionally, we identify HAVOK-like localized spikes in the reservoir dynamics, potentially corresponding to transient instabilities or lobe-switching events in the training data. These findings offer insight into how reservoir networks learn and embed chaotic dynamics, and they suggest practical heuristics for real-time monitoring of training to detect saturation and ensure data efficiency.
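For concreteness, the sketch below illustrates the kind of experiment the abstract describes: a standard tanh ESN with a ridge-regression readout, trained on a Lorenz trajectory at a sequence of discrete training lengths while the per-step training error (whose localized bursts the abstract proposes monitoring) is recorded. This is a minimal illustration, not the authors' implementation; the reservoir size, spectral radius, ridge penalty, integration scheme, and training-length schedule are all assumed values chosen for readability.

```python
import numpy as np

# Illustrative Lorenz trajectory (forward Euler for brevity; RK4 is preferable).
def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x = np.array([1.0, 1.0, 1.05])
    out = np.empty((n_steps, 3))
    for i in range(n_steps):
        dx = np.array([sigma * (x[1] - x[0]),
                       x[0] * (rho - x[2]) - x[1],
                       x[0] * x[1] - beta * x[2]])
        x = x + dt * dx
        out[i] = x
    return out

rng = np.random.default_rng(0)
N, dim = 300, 3                                    # reservoir size, input dimension
Win = rng.uniform(-0.5, 0.5, (N, dim))             # fixed input weights
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # fixed spectral radius of 0.9

def run_reservoir(u):
    """Drive the reservoir with input sequence u; return the state matrix."""
    r = np.zeros(N)
    states = np.empty((len(u), N))
    for t in range(len(u)):
        r = np.tanh(Win @ u[t] + W @ r)
        states[t] = r
    return states

data = lorenz_trajectory(20000)
washout = 200                                      # discard the initial transient
for T in (1000, 2000, 4000, 8000, 16000):          # discrete training lengths
    U, Y = data[:T - 1], data[1:T]                 # one-step-ahead targets
    R = run_reservoir(U)
    R, Yt = R[washout:], Y[washout:]
    # Ridge-regression readout: Wout = Yt^T R (R^T R + lam I)^{-1}
    lam = 1e-6
    Wout = np.linalg.solve(R.T @ R + lam * np.eye(N), R.T @ Yt).T
    err = np.linalg.norm(R @ Wout.T - Yt, axis=1)  # per-step training error
    print(f"T={T:5d}  mean error {err.mean():.3e}  max burst {err.max():.3e}")
```

In this toy setup, the maximum of the per-step training error is the quantity one would watch for localized bursts; the reservoir and readout are refit from scratch at each training length, mirroring the discrete-step protocol described above.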
Presenters
-
Zhiwei D Li
University of Chicago
Authors
-
Zhiwei D Li
University of Chicago
-
Anastasia Bizyaeva
Cornell University
-
Kartik Krishna
University of Washington
-
Dima Tretiak
University of Washington
-
Steven L Brunton
University of Washington