Exploring the Memory Capacity of Embedded Synfire Chains
POSTER
Abstract
Memory is a long-studied property of neural systems that remains poorly understood yet is essential to the success of any animal. The synfire chain is a popular sequence-generating model hypothesized to be a building block for neural computation and memory. Our research explored what affects the storage capacity of memory networks composed of embedded synfire chains. Using computational modeling, we compared the memory capacity of two neuron models, Leaky Integrate-and-Fire (LIF) and Izhikevich, and systematically varied the factors found to affect each network. In both cases, we used a simplified model of global inhibition to control runaway excitation. Our studies showed that the memory capacity of networks of LIF neurons depends strongly on inhibition, excitation, and chain width; the observed capacity was low and insufficient for practical applications. Networks of Izhikevich neurons showed considerably larger memory capacity, a difference that warrants further investigation. Future work will also explore more targeted models of inhibition in the hope of achieving larger memory capacity.
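To make the LIF neuron model referenced above concrete, the sketch below Euler-integrates the standard leaky integrate-and-fire dynamics, dV/dt = (-(V - V_rest) + R·I)/τ, with a threshold-and-reset spike rule. All parameter values and the function name are illustrative assumptions for this sketch, not the values used in the study.

```python
def simulate_lif(current, dt=0.1, t_max=100.0,
                 tau=10.0, v_rest=-65.0, v_reset=-70.0,
                 v_thresh=-50.0, r_m=10.0):
    """Euler-integrate dV/dt = (-(V - v_rest) + r_m * current) / tau.

    All parameters are hypothetical defaults (ms and mV units assumed).
    Returns the list of spike times in ms.
    """
    v = v_rest
    spikes = []
    for step in range(int(t_max / dt)):
        # Leaky integration of the membrane potential
        v += dt * (-(v - v_rest) + r_m * current) / tau
        if v >= v_thresh:          # threshold crossing -> emit a spike
            spikes.append(step * dt)
            v = v_reset            # reset membrane potential after spike
    return spikes

# A constant suprathreshold input current produces regular firing.
spike_times = simulate_lif(current=2.0)
print(spike_times)
```

In a synfire chain, each neuron in a layer would receive its input current from spikes in the preceding layer rather than a constant drive, and a global inhibitory term would be subtracted from the input to control runaway excitation.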
Presenters
-
Sarah Greberman
Department of Physics, Pennsylvania State University
Authors
-
Sarah Greberman
Department of Physics, Pennsylvania State University
-
Yevhen Tupikov
Department of Physics, Pennsylvania State University
-
Dezhe Jin
Department of Physics, Pennsylvania State University