The Thermodynamic Costs of Running Hopfield Networks through Sequences of Memories
ORAL
Abstract
The energy expenditure and heat generation of running modern neural networks are prohibitively large, especially in comparison with networks found in nature. Understanding the thermodynamics of running such systems is therefore of great interest, both for designing artificial neural networks and for understanding the organization of biological ones. We tackle this problem in the context of stochastic Hopfield networks, which are abstract models of associative memory. While the equilibrium properties of Hopfield networks (and of dense associative networks more generally) are well understood, these systems have yet to be explored out of equilibrium. To address this issue, we focus here on the entropy production and associated energy costs generated when Hopfield networks are driven through sequences of stored memories. Using the tools of stochastic thermodynamics, we determine how the energy requirements and entropy production of these networks depend on the number of stored memories, the network architecture, and the amount of work performed on the system during state transitions. Additionally, we use thermodynamic speed limit theorems to bound state transition times as a function of these system parameters.
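The abstract does not specify the simulation details, but the setting can be illustrated with a minimal sketch: a stochastic Hopfield network with Hebbian couplings, updated by single-spin-flip Glauber dynamics at inverse temperature beta, and driven through its stored memories by an external field. The network size, memory count, temperature, and field strength below are placeholder assumptions, and only the entropy flow into the thermal bath is tracked, not the full entropy production of the authors' analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 100, 3            # assumed network size and number of stored memories
beta = 2.0               # assumed inverse temperature of the thermal bath
memories = rng.choice([-1, 1], size=(P, N))

J = (memories.T @ memories) / N      # Hebbian couplings
np.fill_diagonal(J, 0.0)

s = memories[0].copy()               # start in the first memory
sigma_bath = 0.0                     # entropy delivered to the bath (units of k_B)

# Drive the network through the stored memories by switching an external field
# (the field switch is where work is performed on the system).
for mu in range(P):
    h = 0.5 * memories[mu]           # assumed field strength aligned with memory mu
    for _ in range(10 * N):          # Glauber single-spin-flip dynamics
        i = rng.integers(N)
        dE = 2 * s[i] * (J[i] @ s + h[i])        # energy change if spin i flips
        if rng.random() < 1.0 / (1.0 + np.exp(beta * dE)):
            s[i] *= -1
            sigma_bath += -beta * dE             # heat released to the bath raises its entropy
    overlap = (s @ memories[mu]) / N
    print(f"memory {mu}: overlap = {overlap:+.2f}, bath entropy so far = {sigma_bath:.1f} k_B")
```

In this toy version, the cumulative bath entropy grows with each driven transition, giving a rough handle on how costs scale with the number of memories visited; the full treatment in the work would also account for the system's Shannon entropy change and the speed-limit bounds mentioned above.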
Presenters
-
Spencer Rooke
University of Pennsylvania
Authors
-
Spencer Rooke
University of Pennsylvania
-
Qingyue Wu
University of Pennsylvania
-
David H Wolpert
Santa Fe Institute
-
Vijay Balasubramanian
University of Pennsylvania