Image Classification Via Reversible Analog Superconducting Dynamics
ORAL
Abstract
Landauer’s principle holds that logically irreversible digital operations are linked to physically irreversible state transitions, implying that each irreversible operation must produce a finite amount of entropy. Dissipating this entropy to a thermal bath at a finite clock frequency requires a minimum amount of power. To break this fundamental limit on the power efficiency of digital devices, physically reversible hardware has been proposed. Digital devices are also being challenged by the rise of machine-learning architectures, which are often analog and inspired by analog biological neural networks (NNs) [1]. This talk discusses the reversible limit of analog computing, presenting simulations of image classification using a network of superconducting analog flux parametrons (AFPs) [2]. Utilizing Hamiltonian dynamics, these simulations explore realistic reversible and near-reversible hardware. We discuss how information chaos arises within phase-encoded AFP dynamics and the methods used to mitigate it. We show that even when these chaotic effects are mitigated, energy dissipation per operation falls below Landauer's limit. Finally, we compare the performance of various configurations of AFP-based NNs on image classification benchmarks to that of related conventional artificial NNs, finding similar accuracy.
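For context on the dissipation claim above, the standard Landauer bound and the minimum power it implies at a clock frequency f can be written as (textbook expressions, not results specific to this work):

E_{\min} \ge k_B T \ln 2, \qquad P_{\min} \ge f \, k_B T \ln 2.

At a representative superconducting operating temperature of T = 4 K, k_B T ln 2 ≈ 3.8 × 10⁻²³ J per irreversible bit operation.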
[1] Wright et al. ’22, Deep Physical Neural Networks Trained with Backpropagation
[2] Hosoya et al. ’91, Quantum Flux Parametron: A Single Quantum Flux Device for Josephson Supercomputer
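The abstract does not give the simulation details. As a rough illustration of the kind of reversible Hamiltonian dynamics described, the Python sketch below integrates a single toy double-well ("parametron-like") degree of freedom with a leapfrog (symplectic) integrator and checks that flipping the momentum retraces the trajectory back to its initial state. The potential form, parameter values, and integrator choice are illustrative assumptions, not the presenters' model.

# Toy double-well potential standing in for a single analog flux parametron (AFP)
# degree of freedom; the potential form and parameters are illustrative assumptions.
A, B = 1.0, 1.0  # coefficients of U(phi) = -A*phi**2/2 + B*phi**4/4

def force(phi):
    # F = -dU/dphi for the double-well potential above
    return A * phi - B * phi**3

def leapfrog(phi, p, dt, n_steps, mass=1.0):
    # Symplectic, time-reversible integration of H = p**2/(2*mass) + U(phi).
    p += 0.5 * dt * force(phi)            # initial half kick
    for _ in range(n_steps - 1):
        phi += dt * p / mass              # drift
        p += dt * force(phi)              # full kick
    phi += dt * p / mass                  # final drift
    p += 0.5 * dt * force(phi)            # final half kick
    return phi, p

# Integrate forward, then flip the momentum and integrate again:
# dissipationless (reversible) dynamics returns to the starting state.
phi0, p0 = 0.1, 0.0
phi1, p1 = leapfrog(phi0, p0, dt=1e-3, n_steps=20000)
phi2, p2 = leapfrog(phi1, -p1, dt=1e-3, n_steps=20000)
print("forward end state :", phi1, p1)
print("round-trip error  :", abs(phi2 - phi0), abs(-p2 - p0))

In this toy setting the round-trip error is at the level of floating-point roundoff, reflecting the exact time-reversibility of the symplectic scheme; near-reversible hardware would correspond to adding a small dissipative term to the dynamics.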
Presenters
Ian Christie
Northrop Grumman Corporation
Authors
Ian Christie
Northrop Grumman Corporation