The Effects of Physicality on Emergent Learning
ORAL · Invited
Abstract
Recently, a class of self-adjusting analog electronic systems, Contrastive Local Learning Networks (CLLNs), has been experimentally realized [1,2]. These systems learn in response to imposed boundary conditions (datapoints), using physical quantities as adjustable parameters. They evolve into a static structure that solves the desired task, using physical dynamics (not digital logic) for both learning and evaluation. This makes CLLNs similar to materials and biological systems in ways that digital methods such as artificial neural networks (ANNs) are not.
Here, we investigate the direct effects of this physicality. We find that forgetting is an unavoidable consequence of imperfect adaptation. As a result, learning two tasks creates limit cycles in the learning degrees of freedom and rich scaling behaviors that do not appear in a “perfect” system. Further, we find that this imperfect adaptation, together with physical limitations on the adjustable parameters (e.g., conductance can be neither infinite nor negative), suppresses overfitting, a signature of the learning transition from under- to over-parameterization. Broad distinctions between digital and physical learning likely exist, making CLLNs a flexible, manageable, and useful tool for interrogating emergent physical learning, with implications for memory in materials, biological systems, and energy-efficient computing.
[1] Dillavou, Stern, Liu, & Durian, “Demonstration of Decentralized Physics-Driven Learning,” Phys. Rev. Applied (2022).
[2] Dillavou, Beyer, Stern, Liu, Miskin*, & Durian*, “Machine learning without a processor: Emergent learning in a nonlinear analog network,” PNAS (2024).
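To make the contrastive scheme concrete, below is a minimal numerical sketch of contrastive (coupled) learning on a linear resistor network, in the spirit of a CLLN: a free state and a weakly clamped state are compared, and each conductance updates from purely local voltage drops. The network topology, node choices, and hyperparameters (`eta`, `lr`) are illustrative assumptions, not the authors' experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NODES = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]   # adjustable resistors
k = rng.uniform(0.5, 1.5, size=len(edges))          # edge conductances

INPUTS = {0: 1.0, 1: 0.0}   # imposed boundary conditions (source + ground)
TARGET = 3                  # output node
V_DESIRED = 0.4             # desired output voltage

def solve_voltages(k, fixed):
    """Solve Kirchhoff's laws for node voltages given fixed-voltage nodes."""
    L = np.zeros((N_NODES, N_NODES))        # conductance-weighted Laplacian
    for (a, b), g in zip(edges, k):
        L[a, a] += g; L[b, b] += g
        L[a, b] -= g; L[b, a] -= g
    V = np.zeros(N_NODES)
    clamped = list(fixed)
    free = [i for i in range(N_NODES) if i not in fixed]
    V[clamped] = list(fixed.values())
    # Zero net current at free nodes: L_ff V_f = -L_fc V_c
    V[free] = np.linalg.solve(L[np.ix_(free, free)],
                              -L[np.ix_(free, clamped)] @ V[clamped])
    return V

eta, lr = 0.1, 0.05   # nudge amplitude and learning rate (assumed values)
for _ in range(500):
    V_F = solve_voltages(k, INPUTS)                       # free state
    nudge = V_F[TARGET] + eta * (V_DESIRED - V_F[TARGET])
    V_C = solve_voltages(k, {**INPUTS, TARGET: nudge})    # clamped state
    for i, (a, b) in enumerate(edges):
        dF, dC = V_F[a] - V_F[b], V_C[a] - V_C[b]
        k[i] += (lr / eta) * (dF**2 - dC**2)   # local contrastive update
    k = np.clip(k, 1e-3, None)  # physical limit: conductance stays positive

print(solve_voltages(k, INPUTS)[TARGET])   # approaches V_DESIRED
```

Note the clip on `k`: it stands in for the physical constraint, mentioned in the abstract, that conductance can be neither negative nor infinite, and it is this kind of imperfect, bounded adaptation whose consequences the talk examines.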
Presenters
- Sam J Dillavou, University of Pennsylvania

Authors
- Sam J Dillavou, University of Pennsylvania
- Jason W Rocks, Boston University
- Jacob Wycoff, University of Pennsylvania
- Andrea J Liu, University of Pennsylvania
- Marcelo Guzmán, University of Pennsylvania
- Douglas J Durian, University of Pennsylvania