
Equilibrium Propagation: A Physics-Grounded Theory of Computation and Learning

ORAL · Invited

Abstract

We present a mathematical framework of computation and learning grounded in physical principles, called "equilibrium propagation" (Eqprop). This framework is compatible with gradient-descent optimization -- the workhorse of deep learning -- but in Eqprop, inference and gradient computation are achieved using the same physical laws, and the learning rule for each trainable parameter (or "weight") is local. We apply Eqprop to a class of physical systems dubbed "deep resistive networks" (DRNs), i.e. electrical circuits composed of resistors and diodes, in which the conductances of variable resistors play the role of trainable parameters, and diodes play the role of nonlinearities. We show that DRNs have another essential feature of deep learning: like deep neural networks, they are universal approximators (i.e. they can approximate arbitrary input-output functions). We also present a fast algorithm to simulate DRNs and Eqprop on digital computers, and we demonstrate the potential of the framework on standard machine learning benchmarks. Altogether, we contend that our framework can guide the development of fast, compact and low-power hardware for AI (i.e. "learning machines"), in which inference and learning are performed efficiently. Such hardware is expected to be several orders of magnitude faster and more energy-efficient than conventional neural networks trained and run on GPUs.
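
To make the two-phase learning procedure concrete, the sketch below implements the Eqprop gradient estimate in JAX under simplifying assumptions: a toy Hopfield-style energy stands in for the actual DRN circuit equations, equilibria are reached by plain gradient descent on the energy (in hardware, the physics itself performs this relaxation), and all names and hyperparameters (`energy`, `relax`, `beta`, step sizes) are illustrative choices, not the fast simulation algorithm referenced above.

```python
# Minimal two-phase Eqprop sketch (illustrative toy energy, not the DRN model).
import jax
import jax.numpy as jnp

def energy(theta, x, s):
    # Toy Hopfield-style energy: a quadratic leak term plus an input-driven
    # interaction through a tanh nonlinearity. Purely illustrative.
    W, b = theta
    h = jnp.tanh(s)
    return 0.5 * jnp.sum(s ** 2) - x @ W @ h - b @ h

def cost(s, y):
    # Squared error; all state units are treated as outputs for simplicity.
    return 0.5 * jnp.sum((s - y) ** 2)

def relax(f, s, steps=200, lr=0.05):
    # Reach a (local) minimum of f by gradient descent on the state s.
    # In a physical DRN, the circuit settles to this equilibrium on its own.
    grad_s = jax.grad(f)
    for _ in range(steps):
        s = s - lr * grad_s(s)
    return s

def eqprop_gradient(theta, x, y, beta=0.1):
    s0 = jnp.zeros_like(y)
    # Free phase: equilibrium of the energy alone.
    s_free = relax(lambda s: energy(theta, x, s), s0)
    # Nudged phase: the cost weakly "nudges" the state toward the target.
    s_nudged = relax(lambda s: energy(theta, x, s) + beta * cost(s, y), s_free)
    # Contrastive estimate of the loss gradient: difference of dE/dtheta
    # between the two equilibria, scaled by 1/beta.
    dE = jax.grad(energy, argnums=0)
    g_free, g_nudged = dE(theta, x, s_free), dE(theta, x, s_nudged)
    return jax.tree_util.tree_map(lambda gn, gf: (gn - gf) / beta,
                                  g_nudged, g_free)

# Usage: one parameter update on random data.
key = jax.random.PRNGKey(0)
W = 0.1 * jax.random.normal(key, (4, 8))
b = jnp.zeros(8)
x = jnp.ones(4)
y = jnp.zeros(8).at[0].set(1.0)
gW, gb = eqprop_gradient((W, b), x, y)
W, b = W - 0.01 * gW, b - 0.01 * gb
```

The estimator follows the standard two-phase Eqprop recipe: relax to a free equilibrium of the energy E, relax again with the cost weakly coupled (E + beta * C), and read off 1/beta times the difference of dE/dtheta at the two equilibria. For energies that decompose into per-connection terms, dE/dW_ij involves only the two units that W_ij couples, which is what makes the learning rule local.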

We also argue that the Eqprop learning framework could be relevant to physicists, chemists, biologists and neuroscientists who seek to understand how natural systems such as the brain learn from (or adapt to) their environment.

Publications

  • Equilibrium propagation: bridging the gap between energy-based models and backpropagation
  • Training end-to-end analog neural networks with equilibrium propagation
  • A deep learning theory for neural networks grounded in physics
  • Agnostic physics-driven deep learning
  • Frequency propagation: multi-mechanism learning in nonlinear physical networks
  • A universal approximation theorem for deep resistive networks
  • A fast algorithm to simulate deep resistive networks

Presenters

  • Benjamin Scellier

    Rain Neuromorphics, ETH Zurich

Authors

  • Benjamin Scellier

    Rain Neuromorphics, ETH Zurich