Dynamics of long-term memory in recurrent neural networks
ORAL
Abstract
Recent developments in artificial neural networks have opened many possibilities for building long-term memory devices. Yet the dynamics of the memory-retrieval process remain poorly understood, with open issues such as how different memory states compete and how a desired memory state can be recalled. We study memory devices based on reservoir computing, a general class of recurrent neural networks, in two distinct settings: with or without an explicit index/address channel, corresponding to the "location-addressable" and "context-addressable" scenarios, respectively. We demonstrate that, in the location-addressable scenario, a single reservoir computer can store more than a dozen sophisticated memory states, such as chaotic attractors, each of which can be sustained and successfully recalled. We study the memory dynamics with a focus on the success rates of transitions among different memory states, and articulate control strategies to enhance these rates. In the context-addressable setting, which lacks an index channel, we exploit multistability to recall memory states with the aid of cue signals. The memory states can be coexisting asymptotic attractors or transient states. A surprising transition phenomenon in the retrieval success rate emerges as the length of the cue signal varies. The dynamical behaviors associated with memory retrieval uncovered in this work provide foundational insights into developing long-term memory devices based on artificial neural networks.
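For readers unfamiliar with the setup, the following minimal Python sketch illustrates the kind of architecture described above: an echo-state-network reservoir with an optional index/address channel for the location-addressable case, and a cue-driven "listening" phase that seeds recall in the context-addressable case. All names (W_res, W_in, W_idx, W_out) and hyperparameters here are illustrative assumptions for exposition, not details taken from the paper.

import numpy as np

rng = np.random.default_rng(0)
N = 300      # reservoir size (illustrative)
dim = 3      # dimension of a memorized signal, e.g., a chaotic attractor

# Random reservoir, input, and index-channel weights (all illustrative).
W_res = rng.normal(0.0, 1.0, (N, N))
W_res *= 0.9 / np.max(np.abs(np.linalg.eigvals(W_res)))  # spectral radius 0.9
W_in = rng.uniform(-0.5, 0.5, (N, dim))
W_idx = rng.uniform(-0.5, 0.5, (N, 1))

def step(r, u, idx):
    # One reservoir update driven by input signal u and a scalar index value idx.
    return np.tanh(W_res @ r + W_in @ u + W_idx @ np.array([idx]))

def listen(r, cue, idx):
    # Open-loop phase: drive the reservoir with a cue time series (rows of
    # `cue`). In the context-addressable setting idx would be held fixed
    # (e.g., 0) and the cue alone steers the state toward a memory.
    for u in cue:
        r = step(r, u, idx)
    return r

# The readout W_out (shape dim x N) would be trained, e.g., by ridge
# regression on reservoir states collected while the network is driven by
# each memory's time series; in the location-addressable case the index
# channel is held at a distinct constant value per memory during training.

def recall(W_out, r, idx, n_steps):
    # Closed-loop phase: the readout output is fed back as the next input,
    # so the autonomous dynamics sustain (or transition to) the selected memory.
    outputs = []
    for _ in range(n_steps):
        u = W_out @ r
        r = step(r, u, idx)
        outputs.append(u)
    return np.array(outputs), r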
Publication: Ling-Wei Kong, Junjie Jiang, and Ying-Cheng Lai, "Dynamics of long-term memory in recurrent neural networks" (in preparation).
Presenters
- Ling-Wei Kong (Arizona State University)

Authors
- Ling-Wei Kong (Arizona State University)
- Junjie Jiang (Xi'an Jiaotong University)
- Ying-Cheng Lai (Arizona State University)