Physics for local learning

ORAL · Invited

Abstract

Artificial neural networks learn by using a computer to minimize a loss function until they achieve the desired result. Alternatively, many forms of neuromorphic computing use local learning rules inspired by biological learning. We adopt a different approach, focusing on far simpler networks that exploit physics both to perform the forward computation and to obtain local learning rules that replace backpropagation. Our Coupled Learning framework, related to Equilibrium Propagation, can potentially be implemented in mechanical and fluidic networks. It has been realized by our collaborators in laboratory electrical networks, one using digital variable resistors and the other using transistors, paving the way for microfabrication of VLSI realizations. I will discuss back-of-the-envelope estimates of how we expect this learning platform to scale compared to conventional artificial neural networks.
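
As a rough illustration of the kind of local rule involved (a sketch only, not the published implementation), the Python code below applies a coupled-learning-style update to a small linear resistor network: a free state is solved with only the inputs imposed, a clamped state is solved with the outputs nudged a fraction eta toward their desired values, and each edge then updates its own conductance from the difference of its squared voltage drops in the two states. The network topology, node assignments, nudge amplitude, learning rate, and training task are all assumptions chosen for the example.

# Sketch of a coupled-learning-style update on a linear resistor network,
# loosely following the idea in Stern et al., Phys. Rev. X 11, 021045 (2021).
# Topology, task, eta, and learning rate below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_nodes = 8
edges = [(i, j) for i in range(n_nodes) for j in range(i + 1, n_nodes)
         if rng.random() < 0.5]
G = rng.uniform(0.5, 1.5, size=len(edges))   # edge conductances (the learning degrees of freedom)

source_nodes = [0, 1]             # nodes where input voltages are imposed
target_nodes = [6, 7]             # nodes whose voltages we want to control
V_in = np.array([1.0, 0.0])       # input voltages (assumed)
V_desired = np.array([0.6, 0.3])  # desired output voltages (assumed task)

eta = 0.1     # nudge amplitude toward the desired outputs
alpha = 0.05  # learning rate

def solve_network(G, clamped_nodes, clamped_values):
    """Solve Kirchhoff's laws with some node voltages held fixed;
    return the full vector of node voltages."""
    L = np.zeros((n_nodes, n_nodes))           # weighted graph Laplacian
    for (i, j), g in zip(edges, G):
        L[i, i] += g
        L[j, j] += g
        L[i, j] -= g
        L[j, i] -= g
    free = [k for k in range(n_nodes) if k not in clamped_nodes]
    V = np.zeros(n_nodes)
    V[clamped_nodes] = clamped_values
    A = L[np.ix_(free, free)]
    b = -L[np.ix_(free, clamped_nodes)] @ np.asarray(clamped_values)
    V[free] = np.linalg.solve(A, b)            # unclamped nodes settle physically
    return V

for step in range(200):
    # Free state: only the inputs are imposed.
    V_F = solve_network(G, source_nodes, V_in)
    # Clamped state: outputs nudged a fraction eta toward the desired values.
    V_clamp = V_F[target_nodes] + eta * (V_desired - V_F[target_nodes])
    V_C = solve_network(G, source_nodes + target_nodes,
                        np.concatenate([V_in, V_clamp]))
    # Local rule: each edge compares only its own voltage drop in the two states.
    for e, (i, j) in enumerate(edges):
        dV_F = V_F[i] - V_F[j]
        dV_C = V_C[i] - V_C[j]
        G[e] += (alpha / eta) * (dV_F**2 - dV_C**2)
        G[e] = max(G[e], 1e-3)   # keep conductances positive (physical)

print("final outputs:  ", solve_network(G, source_nodes, V_in)[target_nodes])
print("desired outputs:", V_desired)

Because each edge needs only its own voltage drop in the free and clamped states, no global gradient computation is required; in the laboratory electrical networks mentioned above, this comparison is carried out by the circuit elements themselves rather than in software.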

Publication:

  • M. Stern, D. Hexner, J. W. Rocks, and A. J. Liu, "Supervised learning in physical networks: from machine learning to learning machines," Phys. Rev. X 11, 021045 (2021). https://doi.org/10.1103/PhysRevX.11.021045

  • J. Wycoff, S. Dillavou, M. Stern, A. J. Liu, and D. J. Durian, "Asynchronous learning in a physics-driven learning network," J. Chem. Phys. 156, 144903 (2022). https://doi.org/10.1063/5.0084631

  • M. Stern, S. Dillavou, M. Z. Miskin, D. J. Durian, and A. J. Liu, "Physical learning beyond the quasistatic limit," Phys. Rev. Research 4, L022037 (2022).

  • S. Dillavou, M. Stern, A. J. Liu, and D. J. Durian, "Demonstration of decentralized, physics-driven learning," Phys. Rev. Applied 18, 014040 (2022).

Presenters

  • Andrea J Liu

    University of Pennsylvania

Authors

  • Andrea J Liu

    University of Pennsylvania