
Geometry of activity in random networks under external inputs

ORAL

Abstract

Understanding the geometry of neural activity is thought to be relevant for the analysis of cognitive processes, such as decision making. Recently, it has been shown that, for a minimally structured random recurrent neural network in the chaotic regime, the dimensionality of the chaotic spontaneous activity is extensive, i.e., it grows linearly with the number of neurons. Since brain regions usually receive external non-recurrent inputs from other regions or from external stimuli, we investigate how the geometry of activity in such random recurrent networks is affected by static external inputs. For this, we use dynamical mean-field theory and numerical simulations. We show that the dimensionality of the temporal chaos remains extensive and peaks at a nonzero external input strength. To investigate the network's diversity of responses to external inputs, we characterize the dimensionality of the time-averaged response to an ensemble of external inputs. We observe that this dimensionality increases approximately linearly with the external input strength within the chaotic regime. This suggests a trade-off between richer (i.e., higher-dimensional) temporal chaos at lower external input strengths and a greater diversity of time-averaged responses to external inputs at larger input strengths.
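The kind of setup described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' actual method: it assumes a standard rate network, dx/dt = -x + J tanh(x) + I, with Gaussian random couplings of gain g (chaotic for g > 1), a static random input vector I scaled by an input-strength parameter, and dimensionality measured by the participation ratio of the activity covariance spectrum. All parameter names and choices here are illustrative assumptions.

```python
import numpy as np

def simulate_rate_network(N=200, g=2.0, input_strength=0.5,
                          T=200.0, dt=0.1, seed=0):
    """Euler-integrate a random rate network with a static external input.

    Assumptions (not from the abstract): tanh nonlinearity, Gaussian
    couplings with std g/sqrt(N), Gaussian static input scaled by
    `input_strength`. Returns activity after discarding a burn-in.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))  # random recurrent couplings
    I = input_strength * rng.normal(0.0, 1.0, size=N)  # static external input
    x = rng.normal(0.0, 1.0, size=N)                   # random initial condition
    steps = int(T / dt)
    burn = steps // 2                                  # discard transient
    X = np.empty((steps - burn, N))
    for t in range(steps):
        x = x + dt * (-x + J @ np.tanh(x) + I)
        if t >= burn:
            X[t - burn] = x
    return X

def participation_ratio(X):
    """Effective dimensionality: (sum of eigenvalues)^2 / sum of squares,
    computed from the covariance of the activity time series."""
    C = np.cov(X.T)
    ev = np.clip(np.linalg.eigvalsh(C), 0.0, None)
    return ev.sum() ** 2 / (ev ** 2).sum()

# Example: compare dimensionality of the chaotic activity at two input strengths.
for s in (0.0, 0.5):
    pr = participation_ratio(simulate_rate_network(input_strength=s))
    print(f"input_strength={s}: participation ratio = {pr:.1f}")
```

Sweeping `input_strength` over a grid, and similarly computing the participation ratio of time-averaged responses across an ensemble of random inputs, would reproduce the kind of dimensionality curves the abstract describes, under these illustrative modeling assumptions.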

Presenters

  • Zehui Zhao

    Emory University

Authors

  • Zehui Zhao

    Emory University

  • Michael Jeremy Pasek

    Emory University

  • Ilya M Nemenman

    Emory University