How network structure shapes dynamics and learning in recurrent neural networks

POSTER

Abstract

Recent work is yielding connectivity data across a diversity of neural systems. However, how connectivity statistics shape network activity and trainability remains an open problem. In this work, we study the effects of partially symmetric and antisymmetric weight matrices, weight variance, and self-coupling on the dynamics and trainability of continuous-time recurrent neural networks (RNNs). We calculate the full Lyapunov spectrum, yielding estimates of attractor dimension and entropy rate. In networks with small weight variance, partial symmetry increases dimension and entropy rate. In networks with large weight variance, however, partial antisymmetry increases dimension and entropy rate because a smaller fraction of nonlinear units is driven into saturation. In networks with self-coupling, dimension and entropy rate increase with antisymmetry regardless of weight variance. To study the implications of connectivity for learning, we investigate how initial (anti)symmetry affects the training of RNNs with backpropagation to generate limit cycles and to integrate multidimensional input. Partial antisymmetry leads to better final performance and faster learning in both tasks. Our work on RNN structure may provide insight into how connectivity shapes dynamics, learning, and function in biological networks.
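
The abstract does not give the model equations or the exact parameterization of partial (anti)symmetry. As a minimal sketch only, the code below assumes a common setup: a rate network dx/dt = -x + J tanh(x) + s tanh(x) with a Gaussian coupling matrix J built as (A + eta*A^T)/sqrt(1 + eta^2) scaled by g/sqrt(N), so eta = +1 is symmetric, eta = -1 antisymmetric, and s is the self-coupling. The Lyapunov spectrum is estimated with the standard QR (Benettin-style) method applied to the Jacobian of the Euler map, and attractor dimension and entropy rate are then read off via the Kaplan-Yorke formula and Pesin's formula. All function names and parameter values are illustrative, not taken from the poster.

```python
import numpy as np

def coupling_matrix(N, g, eta, rng):
    # Assumed parameterization: variance g^2/N with symmetry parameter eta.
    # eta = +1 -> symmetric, eta = -1 -> antisymmetric, eta = 0 -> i.i.d.
    A = rng.standard_normal((N, N))
    J = (A + eta * A.T) / np.sqrt(1.0 + eta**2)
    return g / np.sqrt(N) * J

def lyapunov_spectrum(J, s=0.0, dt=0.05, t_sim=200.0, t_transient=20.0, seed=0):
    # Euler-integrate dx/dt = -x + J*tanh(x) + s*tanh(x) and estimate the full
    # Lyapunov spectrum by evolving an orthonormal tangent basis with the
    # Jacobian of the one-step map, re-orthonormalizing with QR at each step.
    N = J.shape[0]
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(N)
    Q = np.linalg.qr(rng.standard_normal((N, N)))[0]
    log_growth = np.zeros(N)
    n_transient = int(t_transient / dt)
    n_steps = int(t_sim / dt)
    for step in range(n_transient + n_steps):
        phi = np.tanh(x)
        D = 1.0 - phi**2                                   # tanh'(x)
        # Jacobian of the Euler map x -> x + dt*(-x + (J + s*I) tanh(x))
        M = np.eye(N) + dt * (-np.eye(N) + (J + s * np.eye(N)) * D)
        x = x + dt * (-x + J @ phi + s * phi)
        Q, R = np.linalg.qr(M @ Q)
        if step >= n_transient:                            # discard transient
            log_growth += np.log(np.abs(np.diag(R)))
    return np.sort(log_growth / (n_steps * dt))[::-1]      # descending order

def attractor_dimension(lyap):
    # Kaplan-Yorke dimension from the ordered Lyapunov spectrum.
    cum = np.cumsum(lyap)
    if cum[-1] >= 0:
        return float(len(lyap))
    k = int(np.argmax(cum < 0))       # first index where cumulative sum < 0
    return k + (cum[k - 1] / abs(lyap[k]) if k > 0 else 0.0)

def entropy_rate(lyap):
    # Entropy rate estimated via Pesin's formula (sum of positive exponents).
    return lyap[lyap > 0].sum()

# Example usage (illustrative parameters):
rng = np.random.default_rng(1)
J = coupling_matrix(N=200, g=2.0, eta=-0.5, rng=rng)  # partially antisymmetric
lyap = lyapunov_spectrum(J, s=0.0)
print(attractor_dimension(lyap), entropy_rate(lyap))
```

Sweeping eta, g, and s in such a sketch is one way to reproduce the kind of dimension and entropy-rate comparisons the abstract describes, though the poster's actual model details may differ.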

Presenters

  • Matthew Ding

    Columbia University, Zuckerman Institute

Authors

  • Matthew Ding

    Columbia University, Zuckerman Institute

  • Rainer Engelken

    Columbia University, Zuckerman Institute