
Overcoming Catastrophic Forgetting in Physical Systems

ORAL

Abstract

Physical systems such as Contrastive Local Learning Networks (CLLNs), composed of adjustable resistors, can autonomously learn complex tasks in a decentralized, fast, and power-efficient manner. However, like artificial neural networks, physical learning systems face the persistent challenge of catastrophic forgetting: the tendency to lose previously learned functionality upon learning new tasks. Here we show that thresholds imposed on the physical learning rule can improve memory retention by efficiently partitioning the learning degrees of freedom among different tasks. We simulate disordered resistor networks and sequentially train them on two types of tasks, edge allostery or linear regression, with different threshold levels for small updates. We find a rich trade-off between the degree of over-parametrization, error response, and memory, particularly when the spatial extents of the sequential tasks are large and overlapping. An optimal level of thresholding not only improves memory without the need to identify and preserve resistances important to previously learned tasks, but also reduces the material cost of the training process. Our results suggest that updating only for changes above a threshold is a useful strategy that improves memory, increases the number of tasks that can be learned, and reduces the energetic cost of learning in physical systems. It may also be biologically plausible.
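To make the thresholded update concrete, below is a minimal sketch of coupled learning in a resistor network with a cutoff on small conductance updates. It assumes the standard two-phase (free/clamped) coupled-learning rule for resistor networks; the network structure, the edge-allostery task, and all parameter values (eta, gamma, threshold) are illustrative choices, not the specific setup used in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative disordered network: a ring (to guarantee connectivity) plus random shortcuts.
N = 20
edges = [(i, (i + 1) % N) for i in range(N)]
edges += [(i, j) for i in range(N) for j in range(i + 2, N) if rng.random() < 0.15]
k = rng.uniform(0.5, 1.5, size=len(edges))  # edge conductances (learning degrees of freedom)

def solve_node_voltages(k, fixed):
    """Solve Kirchhoff's laws with the node voltages in `fixed` (node -> volts) held fixed."""
    L = np.zeros((N, N))
    for (a, b), kj in zip(edges, k):
        L[a, a] += kj; L[b, b] += kj
        L[a, b] -= kj; L[b, a] -= kj
    V = np.zeros(N)
    free = [n for n in range(N) if n not in fixed]
    for n, v in fixed.items():
        V[n] = v
    rhs = -L[np.ix_(free, list(fixed))] @ np.array(list(fixed.values()))
    V[free] = np.linalg.solve(L[np.ix_(free, free)], rhs)
    return V

# Edge-allostery task (illustrative): impose a source voltage drop, ask for a
# desired voltage drop across a target edge elsewhere in the network.
source = {0: 1.0, 1: 0.0}
target_edge = (5, 6)
desired_drop = 0.3

eta, gamma, threshold = 0.1, 0.05, 1e-3  # nudge amplitude, learning rate, update threshold (assumed values)

for step in range(500):
    # Free state: only the source voltages are imposed.
    VF = solve_node_voltages(k, source)
    drop_F = VF[target_edge[0]] - VF[target_edge[1]]
    # Clamped state: nudge the target nodes a fraction eta toward the desired drop.
    clamp = dict(source)
    clamp[target_edge[0]] = VF[target_edge[0]] + 0.5 * eta * (desired_drop - drop_F)
    clamp[target_edge[1]] = VF[target_edge[1]] - 0.5 * eta * (desired_drop - drop_F)
    VC = solve_node_voltages(k, clamp)
    # Local coupled-learning update, applied only where it exceeds the threshold.
    for j, (a, b) in enumerate(edges):
        dF, dC = VF[a] - VF[b], VC[a] - VC[b]
        dk = (gamma / eta) * (dF**2 - dC**2)
        if abs(dk) > threshold:            # updates below the threshold are discarded
            k[j] = max(k[j] + dk, 1e-6)    # keep conductances positive
```

In this sketch, raising the threshold means fewer edges are modified per training step, which is the mechanism by which thresholding can leave resistances learned for an earlier task untouched while a later task is trained.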

Publication: Overcoming Catastrophic Forgetting in Physical Learning Machines (in preparation)

Presenters

  • Purba Chatterjee

    University of Pennsylvania

Authors

  • Purba Chatterjee

    University of Pennsylvania

  • Marcelo Guzmán

    University of Pennsylvania

  • Andrea J Liu

    University of Pennsylvania