Unconditional stability of a recurrent circuit implementing divisive normalization

ORAL

Abstract

Stability is crucial for effective training of recurrent neural circuits. Traditional cortical models are notoriously difficult to train due to expansive nonlinearities in the dynamical system, often leading to instability. Conversely, recurrent neural networks (RNNs) excel in tasks involving sequential data but lack biological plausibility and interpretability. In this work, we address these challenges by linking divisive normalization (DN) to the stability of ORGaNICs, a biologically plausible recurrent cortical circuit model that dynamically achieves DN and has been shown to simulate a wide range of neurophysiological phenomena. Using Lyapunov's indirect method, we prove the remarkable property of unconditional local stability for arbitrary-dimensional ORGaNICs when the recurrent weight matrix is the identity matrix. We thus connect ORGaNICs to a system of coupled damped harmonic oscillators, which enables us to derive its energy function, providing a normative principle of what the circuit aims to accomplish. Further, for a generic recurrent weight matrix, we prove the stability of the 2D model and demonstrate empirically that stability holds in higher dimensions. Finally, we show that ORGaNICs can be trained by backpropagation through time (BPTT) without using any specialized techniques. We find that ORGaNICs outperform alternative neurodynamical models on static image classification tasks and perform comparably to LSTMs on sequential tasks.
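The stability argument above rests on Lyapunov's indirect method: linearize the dynamics at an equilibrium and check that every eigenvalue of the Jacobian has negative real part. As a minimal, hypothetical illustration (not the ORGaNICs circuit itself), the sketch below applies this test to a single damped harmonic oscillator, the building block the abstract connects ORGaNICs to; the damping and stiffness values are illustrative assumptions.

```python
import numpy as np

# Damped harmonic oscillator x'' + c x' + k x = 0, written in state-space
# form z = (x, x'). The abstract links ORGaNICs to coupled damped
# oscillators; c and k here are illustrative, not model parameters.
c, k = 0.5, 2.0  # damping and stiffness, both positive

# Jacobian of the linear system at its equilibrium (the origin).
J = np.array([[0.0, 1.0],
              [-k,  -c]])

# Lyapunov's indirect method: local asymptotic stability holds iff all
# eigenvalues of the Jacobian have strictly negative real part.
eigvals = np.linalg.eigvals(J)
stable = bool(np.all(eigvals.real < 0))
print(stable)  # True for any c, k > 0
```

For any positive damping and stiffness the characteristic polynomial is λ² + cλ + k, whose roots always have negative real part, so the test returns `True`; in the paper this check is carried out for the full nonlinear circuit's linearization.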

Presenters

  • Shivang Rawat

    New York University (NYU)

Authors

  • Shivang Rawat

    New York University (NYU)

  • David J Heeger

New York University (NYU)

  • Stefano Martiniani

    New York University (NYU)