On matching symmetries and information between training time series and machine dynamics.
ORAL
Abstract
Recurrent networks and some deep feed-forward networks in machine learning effectively construct a very high-dimensional dynamical system that classifies objects through its asymptotic dynamics. Many training inputs of the same class are used to construct a machine whose similar trajectories flow to the same fixed-point attractor; the attractor itself carries no information/entropy. A specific example in reservoir computing is to train a single-layer machine on a trajectory of a known chaotic dynamical system and to construct a linear projection from the machine variables back to the chaotic system that reproduces the training chaotic trajectory (validation) and "predicts" a short distance into the future (testing). Our perspective is to go beyond validation and testing of a particular trajectory and to analyze the symmetry and information in the general asymptotic dynamics of the trained machine. For well-trained machines, we can make the information and symmetries of the machine dynamics approximately match those of the training dynamical system. The machine's dynamics then has its own strange attractor, and the machine can generate new time series of the same class, i.e. time series that are different from but equivalent to the training data.
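The reservoir-computing setup described in the abstract can be sketched as follows. This is a minimal echo-state-network illustration, not the authors' implementation: all parameter values (reservoir size, spectral radius, ridge penalty, the choice of the Lorenz system and its Euler integration) are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Training data: a trajectory of a known chaotic system (Lorenz, Euler step) ---
def lorenz_step(u, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = u
    return u + dt * np.array([sigma * (y - x),
                              x * (rho - z) - y,
                              x * y - beta * z])

T = 5000
traj = np.empty((T, 3))
traj[0] = [1.0, 1.0, 1.0]
for t in range(1, T):
    traj[t] = lorenz_step(traj[t - 1])

# --- Single-layer machine: a random recurrent reservoir driven by the trajectory ---
N = 300                                              # reservoir size (assumed)
W = rng.normal(size=(N, N)) / np.sqrt(N)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # rescale spectral radius to 0.9
W_in = rng.uniform(-0.5, 0.5, size=(N, 3))           # input coupling (assumed scale)

states = np.zeros((T, N))
for t in range(1, T):
    states[t] = np.tanh(W @ states[t - 1] + W_in @ traj[t - 1])

# --- Linear projection from machine variables back to the chaotic system ---
# Ridge regression maps the reservoir state at time t (which has seen inputs
# up to t-1) to the system state at time t: a one-step-ahead readout.
washout = 200                                        # discard transient states
X, Y = states[washout:], traj[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y).T

# Validation: how well the readout reproduces the training trajectory
pred = X @ W_out.T
err = np.sqrt(np.mean((pred - Y) ** 2))
```

Feeding the readout's prediction back into the reservoir in place of the true input would run the machine autonomously, which is the regime the abstract analyzes: the trained machine's own asymptotic dynamics and attractor, rather than its fit to one particular trajectory.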
Presenters
- Jan Engelbrecht, Boston College

Authors
- Jan Engelbrecht, Boston College
- Owen Tong Yang, Boston College
- Renato Mirollo, Boston College