Dynamics of Neural Networks with Nonlinear Synaptic Interactions
ORAL
Abstract
Models of neurons often assume that presynaptic inputs to the dendrites are linearly summed and passed through a nonlinearity, which is then taken as the output. In reality, neurons are more dynamic processors of their input, adapting and responding to input in a state-dependent manner. Such nonlinear dendritic integration suggests that an effective description of a neuron should include higher-order, multiplicative interactions of inputs. With this motivation, we take an approach to developing an effective model of realistic neuronal dynamics that is inspired by the effective theories of physics. Namely, we take as our starting point a model in which all polynomial interactions that are not explicitly forbidden are allowed, and study the consequences of these interactions for the resulting dynamical behavior of the system. We develop this approach using a solvable discrete-time recurrent neural network which is a straightforward generalization of well-studied models of recurrent neural networks to include higher-order interactions. Higher-order dendritic interactions induce tensor coupling between neuronal activity and external input, which we find results in some novel phenomena. For completely random couplings, we solve this model in the mean-field limit and obtain a phase diagram describing the transition to chaos. We find that higher-order interactions generically lead to a discontinuous transition to chaos as a function of the interaction strengths. Furthermore, external input in these networks tends to destabilize the dynamics, in stark contrast to behavior typically observed in models with linear dendritic summation. We also explore the complexity of the phase space of these models, and show a striking decoupling between the phase space complexity, as measured by the growth rate of the number of critical points, and the dynamical complexity, as measured by the maximal Lyapunov exponent.
Finally, we draw connections to other models of multiplicative interactions, such as gated RNNs, neural ordinary differential equations, and dense associative memory. Overall, this work opens up an interesting direction to study the emergent properties of more realistic neural network models and the dynamical regimes they operate in.
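To make the class of models concrete, the following is a minimal sketch of a discrete-time recurrent network that augments the standard linear summation with a second-order multiplicative term. It is an illustration under assumptions, not the authors' specific model: the network size `N`, the coupling strengths `g2` and `g3`, the `tanh` nonlinearity, and the variance scalings of the random couplings are all hypothetical choices made here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100   # number of neurons (illustrative choice)
g2 = 1.0  # strength of pairwise (linear-summation) couplings
g3 = 0.5  # strength of third-order (multiplicative dendritic) couplings

# Completely random Gaussian couplings; variances are scaled with N so that
# each term in the preactivation stays O(1) in a mean-field-style limit.
J = rng.normal(0.0, g2 / np.sqrt(N), size=(N, N))      # pairwise matrix
T = rng.normal(0.0, g3 / N, size=(N, N, N))            # interaction tensor

def step(x, u=None):
    """One update: linear term plus a quadratic (tensor) term, then tanh.

    x -- current activity vector, shape (N,)
    u -- optional external input vector, shape (N,)
    """
    pre = J @ x + np.einsum("ijk,j,k->i", T, x, x)
    if u is not None:
        pre = pre + u
    return np.tanh(pre)

# Iterate the dynamics from a random initial condition.
x = rng.normal(size=N)
for _ in range(100):
    x = step(x)
```

Setting `g3 = 0` recovers the familiar random recurrent network with purely linear dendritic summation, so varying `g2` and `g3` is one way to probe how the higher-order term changes the dynamical regime.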
–
Presenters
-
Tankut U Can
Emory University
Authors
-
Tankut U Can
Emory University
-
Kamesh Krishnamurthy
Zyphra