Coarse scale representation of spiking neural networks: from dynamics to backpropagation through spikes
ORAL
Abstract
Leaky integrate-and-fire neurons have long been used as a model system to understand the dynamics of spiking neural networks, and have recently become the underlying model of neuromorphic chips such as Intel's Loihi. One of the interesting features of this type of model is the presence of an absolute refractory period, which limits the maximum spike rate the system can attain. It is also the largest time interval that guarantees at most a single spike per neuron. In this work we explore the development of coarse scale representations of leaky integrate-and-fire neurons that operate at this timescale. Our coarse scale approximation is obtained by assuming that spike arrival times are homogeneously distributed over the time interval, and it results in a discrete representation that exhibits equivalent dynamics on randomly connected networks. Moreover, the coarse scale model allows us to implement stochastic gradient descent methods for spiking neurons that take advantage of backpropagation. This provides a useful baseline against which to compare more bio-inspired approaches based on local learning rules, as well as to assess the impact of different codings on the network's ability to learn and generalize.
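The abstract does not give the model equations, but the coarse-scale idea can be illustrated with a minimal sketch: take the simulation step equal to the absolute refractory period (so each neuron emits at most one spike per step), and replace each input spike's leaky contribution with its time-average under the assumption that arrival times are uniformly distributed over the step. All parameter values and variable names below are hypothetical, not taken from the work itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative only, not from the abstract).
n = 100       # number of neurons
tau = 20.0    # membrane time constant (ms)
t_ref = 5.0   # absolute refractory period (ms) = coarse time step
v_th = 1.0    # firing threshold

decay = np.exp(-t_ref / tau)
# If a spike arrives at a time t uniform in [0, t_ref], its contribution to
# the membrane at the end of the step, averaged over t, is:
#   (1/t_ref) * integral_0^t_ref exp(-(t_ref - t)/tau) dt
# = (tau/t_ref) * (1 - exp(-t_ref/tau))
kernel = (tau / t_ref) * (1.0 - decay)

W = rng.normal(0.0, 0.3, size=(n, n))      # random recurrent weights
v = np.zeros(n)                            # membrane potentials
s = (rng.random(n) < 0.2).astype(float)    # initial spikes: 0 or 1 per step

for _ in range(50):
    # One coarse step: decay the membrane, add the time-averaged input.
    v = decay * v + kernel * (W @ s)
    s = (v >= v_th).astype(float)          # at most one spike per neuron
    v = np.where(s > 0, 0.0, v)            # reset neurons that fired

print("mean spikes per neuron per step:", s.mean())
```

Because spikes are 0/1 per step, the state is already in the binary form that surrogate-gradient backpropagation methods operate on; in a training setting, the hard threshold would be replaced by a differentiable surrogate in the backward pass.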
Presenters
- Angel Yanguas-Gil, Argonne National Laboratory
Authors
- Angel Yanguas-Gil, Argonne National Laboratory