Impact of correlated connections in large recurrent networks with mesoscopic structure
ORAL
Abstract
Random recurrent networks serve as a useful tool for the tractable analysis of large neural networks. The spectrum of the connectivity matrix determines the network's linear dynamics as well as the stability of its nonlinear dynamics. Knowledge of the onset of chaos helps determine the network's computational capabilities and memory capacity. However, fully homogeneous random networks lack the nontrivial structure found in real-world networks, such as cell types and plasticity-induced correlations in neural networks. We address this deficiency by investigating the impact of correlations between forward and reverse connections, which may depend on the neuronal type. Using random matrix theory, we derive a set of self-consistent equations that efficiently compute the eigenvalue spectrum of large random matrices with block-structured correlations. The inclusion of structured correlations distorts the eigenvalue distribution in a nontrivial way: the distribution is neither a circle nor an ellipse. We find that layered networks with strong interlayer correlations have gapped spectra. For antisymmetric layered networks, oscillatory modes dominate the linear dynamics. In simple cases we find analytic expressions for the support of the eigenvalue distribution.
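The baseline the abstract generalizes, a single population with correlation between forward and reverse connections, is the classical elliptic ensemble: with entry variance 1/N and Corr(J_ij, J_ji) = τ, the eigenvalues fill an ellipse with semi-axes 1 + τ (real) and 1 − τ (imaginary). The sketch below is an illustrative assumption, not the authors' block-structured self-consistent method; the symmetric/antisymmetric construction and all names are our own.

```python
import numpy as np

def correlated_gaussian_matrix(n, tau, rng):
    """Random n-by-n matrix with Var(J_ij) ~ 1/n and Corr(J_ij, J_ji) = tau.

    Mixes independent symmetric and antisymmetric Gaussian parts, a
    standard construction for sampling the elliptic ensemble.
    """
    g = rng.standard_normal((n, n))
    h = rng.standard_normal((n, n))
    sym = (g + g.T) / np.sqrt(2)    # symmetric part, unit off-diagonal variance
    asym = (h - h.T) / np.sqrt(2)   # antisymmetric part, unit off-diagonal variance
    j = np.sqrt((1 + tau) / 2) * sym + np.sqrt((1 - tau) / 2) * asym
    return j / np.sqrt(n)

rng = np.random.default_rng(0)
n, tau = 1000, 0.5
J = correlated_gaussian_matrix(n, tau, rng)
eigvals = np.linalg.eigvals(J)

# Elliptic law: for large n the spectrum fills an ellipse with
# semi-axes 1 + tau along the real axis and 1 - tau along the imaginary axis.
print(eigvals.real.max(), np.abs(eigvals.imag).max())
```

With τ > 0 the spectral radius along the real axis exceeds 1, so positive forward/reverse correlations push the network toward instability sooner than an uncorrelated (circular-law) network; the block-structured correlations studied in the abstract deform this ellipse further.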
Presenters
-
Alexander Kuczala
Salk Inst
Authors
-
Alexander Kuczala
Salk Inst
-
Tatyana Olegivna Sharpee
Salk Institute for Biological Studies