Input spike trains suppress chaos in balanced neural circuits

ORAL

Abstract

A longstanding hypothesis holds that structured input to neural circuits enhances the reliability of spiking responses. While studies in single neurons support this hypothesis well [Mainen, Sejnowski 1995], the impact of input structure on the dynamics of recurrent networks is not well understood. Earlier studies of the dynamic stability of the balanced state used constant external input [van Vreeswijk, Sompolinsky 1996; Monteforte, Wolf 2010] or white noise [Lajoie et al. 2014]. We generalize the analysis of dynamical stability to balanced networks driven by input spike trains. An analytical expression for the Jacobian enables us to calculate the full Lyapunov spectrum. We solve the dynamics in numerically exact event-based simulations and calculate Lyapunov spectra, entropy production rates and attractor dimensions. We examine the transition from constant to stochastic input in various scenarios and find that input spike trains suppress chaos. We also find that both independent bursty input spike trains and common input reduce chaos in spiking networks even more strongly. Our work extends studies of chaotic rate models [Molgedey et al. 1992] to spiking neuron models and opens a novel avenue for studying how sensory streams shape the dynamics of large networks.
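The quantities named above follow from standard ergodic-theory estimators once a sequence of Jacobians is available. The sketch below is not the authors' code; the function names, the toy random Jacobians and all parameters are illustrative assumptions. It shows the usual Benettin/QR reorthonormalization scheme for the Lyapunov spectrum, a Pesin-type upper bound on the entropy production rate (sum of the positive exponents), and the Kaplan-Yorke estimate of the attractor dimension.

```python
import numpy as np

def lyapunov_spectrum(jacobians, total_time):
    """Benettin/QR method: propagate an orthonormal frame through a
    sequence of Jacobians and accumulate the log growth rates."""
    n = jacobians[0].shape[0]
    Q = np.eye(n)
    log_growth = np.zeros(n)
    for D in jacobians:
        Q, R = np.linalg.qr(D @ Q)
        log_growth += np.log(np.abs(np.diag(R)))
    return np.sort(log_growth / total_time)[::-1]

def entropy_rate(spectrum):
    """Pesin-type upper bound on the entropy production rate:
    sum of the positive Lyapunov exponents."""
    return spectrum[spectrum > 0].sum()

def attractor_dimension(spectrum):
    """Kaplan-Yorke estimate of the attractor dimension from the
    ordered Lyapunov spectrum."""
    csum = np.cumsum(spectrum)
    if csum[0] < 0:
        return 0.0
    k = int(np.max(np.nonzero(csum >= 0)[0])) + 1  # largest k with csum_k >= 0
    if k == len(spectrum):
        return float(len(spectrum))
    return k + csum[k - 1] / abs(spectrum[k])

# Purely illustrative usage with random near-identity matrices standing in
# for the analytical single-spike Jacobians of an event-based simulation.
rng = np.random.default_rng(0)
jacs = [np.eye(50) + 0.03 * rng.standard_normal((50, 50)) for _ in range(1000)]
lams = lyapunov_spectrum(jacs, total_time=1000 * 0.1)
print(entropy_rate(lams), attractor_dimension(lams))
```

In the event-based setting described in the abstract, each Jacobian would be evaluated analytically at a spike time rather than drawn at random; the spectrum, entropy bound and dimension estimate are then obtained exactly as in this sketch.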

Authors

  • Rainer Engelken

    Max Planck Institute for Dynamics and Self-Organization

  • Michael Monteforte

    Max Planck Institute for Dynamics and Self-Organization

  • Fred Wolf

Max Planck Institute for Dynamics and Self-Organization