Gated recurrent neural networks 2: a novel first-order chaotic transition
ORAL
Abstract
We study the transition to chaos in gated recurrent neural networks (RNNs). Gating refers to a multiplicative interaction that can modulate the coupling strength between neurons. Such interactions are found in real biological neurons as well as in the best-performing architectures in machine learning, but their dynamical consequences are poorly characterized. We show that a striking consequence of gating is the emergence of a first-order transition to chaos, in which the maximal Lyapunov exponent jumps discontinuously from negative (stable) to positive (chaotic) values. Furthermore, we observe a decoupling of the topological trivialization transition from the transition to chaos: saddle points in the dynamics can emerge and proliferate well before chaos appears. This contrasts with the chaotic transition in RNNs with purely additive interactions. Finally, we discuss the consequences such a discontinuous transition may have for machine learning practice.
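The transition described above can be probed numerically by estimating the maximal Lyapunov exponent of a gated network. The sketch below uses a simple discrete-time gated RNN of my own choosing for illustration (the specific gate form, parameters, and the coupling gain `g` are assumptions, not the authors' exact model), together with a standard Benettin-style two-trajectory estimate of the maximal Lyapunov exponent:

```python
import numpy as np

# Illustrative gated RNN (an assumed form, not necessarily the authors' model):
#   h_{t+1} = (1 - z_t) * h_t + z_t * tanh(J @ h_t),  z_t = sigmoid(W @ h_t)
# The gate z_t multiplicatively modulates the effective coupling strength.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(h, J, W):
    z = sigmoid(W @ h)
    return (1.0 - z) * h + z * np.tanh(J @ h)

def max_lyapunov(J, W, h0, n_steps=2000, n_transient=500, eps=1e-8):
    """Benettin-style estimate: evolve a reference and a perturbed
    trajectory, renormalize their separation each step, and average
    the log growth rate after a transient."""
    rng = np.random.default_rng(0)
    h = h0.copy()
    d = rng.standard_normal(h.size)
    d *= eps / np.linalg.norm(d)
    log_growth = 0.0
    for t in range(n_steps):
        h_new = step(h, J, W)
        hp_new = step(h + d, J, W)
        delta = hp_new - h_new
        norm = max(np.linalg.norm(delta), 1e-300)  # guard against exact collapse
        if t >= n_transient:
            log_growth += np.log(norm / eps)
        d = delta * (eps / norm)  # renormalize the separation vector
        h = h_new
    return log_growth / (n_steps - n_transient)

N = 200
rng = np.random.default_rng(1)
g = 3.0  # coupling gain; sweeping g traces out the stable-to-chaotic transition
J = g * rng.standard_normal((N, N)) / np.sqrt(N)
W = rng.standard_normal((N, N)) / np.sqrt(N)
lam = max_lyapunov(J, W, h0=rng.standard_normal(N))
print(f"estimated maximal Lyapunov exponent: {lam:.3f}")
```

Scanning `g` (or a gate bias) and plotting the estimated exponent would reveal whether it crosses zero continuously or jumps discontinuously, which is the signature of the first-order transition reported in the abstract.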
Presenters
-
Tankut Can
Initiative for the Theoretical Sciences, The Graduate Center, CUNY, USA
Authors
-
Tankut Can
Initiative for the Theoretical Sciences, The Graduate Center, CUNY, USA
-
Kamesh Krishnamurthy
Dept. of Physics and Princeton Neuroscience Institute, Princeton University, USA
-
David J Schwab
Initiative for the Theoretical Sciences, The Graduate Center, CUNY, USA