Associative Memory of Knowledge Structures and Sequences
ORAL · Invited
Abstract
A long-standing challenge in biological and artificial intelligence is to understand how new knowledge can be constructed from known building blocks in a way that is amenable to computation by neuronal circuits. Here we focus on the task of storage and recall of structured knowledge in long-term memory. Specifically, we ask how recurrent neuronal networks can store and retrieve multiple knowledge structures. We model each structure as a set of binary relations between events and cues (cues may represent, e.g., temporal order, spatial location, or role in a semantic structure). We use a binarized holographic reduced representation (HRR) to map such structures to distributed neuronal activity patterns. We then use associative memory plasticity rules to store these activity patterns as fixed points of the recurrent network. Through a combination of signal-to-noise analysis and numerical simulations, we demonstrate that our model allows for efficient storage of these knowledge structures, such that the memorized structures as well as their individual building blocks (e.g., events and cues) can subsequently be retrieved from partial retrieval cues. We show that long-term memory of structured knowledge relies on a new principle of computation beyond the memory basins of attraction. Finally, we show that our model can be extended to store sequences as single attractors.
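To make the pipeline concrete, below is a minimal sketch (not the authors' implementation) of the two ingredients named in the abstract: HRR-style binding of cue/event pairs via circular convolution with sign binarization, and Hebbian (Hopfield-style) storage of the resulting structure pattern as a fixed point of a recurrent network. The network size, number of relations, noise level, and the simple Hebbian rule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2048  # pattern dimension / number of neurons (illustrative choice)

def random_pattern(n):
    """Random +/-1 pattern standing in for a cue or event representation."""
    return rng.choice([-1.0, 1.0], size=n)

def binarize(x):
    """Sign-binarize to +/-1 (avoids zeros from exact ties)."""
    return np.where(x >= 0, 1.0, -1.0)

def bind(a, b):
    """HRR binding: circular convolution of two patterns, then binarized."""
    return binarize(np.fft.ifft(np.fft.fft(a) * np.fft.fft(b)).real)

def unbind(s, a):
    """Approximate unbinding: circular correlation with the cue, binarized."""
    return binarize(np.fft.ifft(np.fft.fft(s) * np.conj(np.fft.fft(a))).real)

# A toy "structure": two (cue, event) relations superposed and binarized.
cues = [random_pattern(N) for _ in range(2)]
events = [random_pattern(N) for _ in range(2)]
structure = binarize(sum(bind(c, e) for c, e in zip(cues, events)))

# Hebbian storage of the structure pattern as a fixed point of a recurrent net.
W = np.outer(structure, structure) / N
np.fill_diagonal(W, 0.0)

# Retrieval from a partial/corrupted version of the stored structure.
state = structure.copy()
flip = rng.choice(N, size=N // 10, replace=False)
state[flip] *= -1
for _ in range(20):                      # synchronous updates toward the fixed point
    state = binarize(W @ state)

# Decode one constituent event from the retrieved structure using its cue.
decoded = unbind(state, cues[0])
print("overlap with stored structure:", float(state @ structure) / N)
print("overlap with bound event     :", float(decoded @ events[0]) / N)
```

In this sketch the recurrent dynamics clean up the corrupted pattern, and unbinding with a cue then recovers a noisy but positively correlated version of the corresponding event; the abstract's signal-to-noise analysis concerns how many such structures and relations can be stored and retrieved reliably.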
Publication: Julia Steinberg and Haim Sompolinsky. Associative memory of structured knowledge. In preparation
Presenters
-
Julia A Steinberg
Princeton University
Authors
-
Julia A Steinberg
Princeton University
-
Haim I Sompolinsky
The Hebrew University of Jerusalem and Center for Brain Science, Harvard University