Deep physical neural networks using physics-aware training

ORAL

Abstract

Deep neural networks have become ubiquitous in today’s data-driven world, but their energy requirements increasingly limit their scalability and broader use. Here, we propose the construction of deep physical neural networks that are made from layers of controllable physical systems, which can learn hierarchical representations of input data analogous to those of deep neural networks. To train these physical neural networks, we introduce a hybrid in situ–in silico algorithm, physics-aware training. This training method has favorable scaling properties, as it uses backpropagation, the de facto training method for deep neural networks. To demonstrate the universality of our approach, we train diverse physical neural networks based on optics, mechanics, and electronics to experimentally perform audio and image classification tasks. Our approach broadens the possibility of using novel physical systems for deep learning and potentially enables them to perform machine learning faster and more energy-efficiently than conventional electronic processors.
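The core idea of the hybrid in situ–in silico scheme can be sketched in a few lines: the forward pass is executed on the physical system itself, while gradients are computed with a differentiable digital model of that system and applied at the physically measured outputs. The sketch below is a toy illustration of that loop, not the paper's implementation; the "physical system" is a hypothetical stand-in (a noisy nonlinear transform), and `digital_model_grad` plays the role of the in silico backward pass.

```python
import numpy as np

rng = np.random.default_rng(0)

def physical_system(x, theta):
    # Stand-in for a real physical layer (optical, mechanical, or
    # electronic in the paper): a nonlinear transform plus measurement
    # noise. This is a hypothetical toy, not the paper's hardware.
    return np.tanh(theta @ x) + 0.01 * rng.normal(size=theta.shape[0])

def digital_model_grad(x, theta, grad_out):
    # In silico backward pass through a differentiable digital model
    # y = tanh(theta @ x):  dL/dtheta = (1 - tanh^2(theta @ x)) * grad_out  (outer)  x
    d = (1.0 - np.tanh(theta @ x) ** 2) * grad_out
    return np.outer(d, x)

# Toy task: train one "physical" layer to match a fixed target output.
theta = rng.normal(size=(3, 4)) * 0.5
x = rng.normal(size=4)
target = np.array([0.5, -0.2, 0.1])

for step in range(200):
    y = physical_system(x, theta)               # forward pass on the "hardware"
    grad_out = 2.0 * (y - target)               # loss gradient at the physical output
    theta -= 0.1 * digital_model_grad(x, theta, grad_out)  # in silico update

final_loss = float(np.mean((physical_system(x, theta) - target) ** 2))
print(final_loss)
```

Because the error signal is evaluated at the *physically measured* outputs, the update partially compensates for the mismatch (here, noise) between the digital model and the physical system, which is the motivation for training in situ rather than purely in simulation.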

Publication: L. G. Wright, T. Onodera, M. M. Stein, T. Wang, D. T. Schachter, Z. Hu, and P. L. McMahon, Deep physical neural networks enabled by a backpropagation algorithm for arbitrary physical systems, arXiv:2104.13386 (2021).

Presenters

  • Tatsuhiro Onodera

Cornell University & NTT Research

Authors

  • Tatsuhiro Onodera

Cornell University & NTT Research

  • Logan G Wright

Cornell University & NTT Research

  • Martin Stein

    Cornell University

  • Tianyu Wang

    Cornell University

  • Darren T Schachter

    Cornell University

  • Zoey Hu

    Cornell University

  • Peter L McMahon

Cornell University & Stanford University