Quantum Long Short-Term Memory
ORAL
Abstract
Recurrent neural networks (RNNs) are used to model data with sequential and temporal dependencies. One of their variants, Long Short-Term Memory (LSTM), has been successfully applied to a wide spectrum of such tasks. In this talk, we propose an LSTM model based on the hybrid quantum-classical paradigm, which we call QLSTM. The proposed architecture succeeds on several test cases with temporal or sequential dependencies. In particular, we show that in certain scenarios our quantum version of LSTM learns faster, or reaches optimal accuracy sooner, than its classical counterpart. In addition, with variational quantum circuits as the building blocks, the proposed architecture has potential applications on Noisy Intermediate-Scale Quantum (NISQ) devices.
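As a rough illustration of the hybrid quantum-classical idea, the sketch below replaces the four classical gate transformations of an LSTM cell with variational quantum circuits. This is a minimal sketch assuming PennyLane with its PyTorch interface; the circuit ansatz (AngleEmbedding plus BasicEntanglerLayers), the qubit count, and the linear compress/expand layers around each circuit are illustrative assumptions, not the specific design presented in the talk.

```python
# Minimal hybrid QLSTM cell sketch (assumed PennyLane + PyTorch setup).
# The circuit ansatz below is an illustrative choice, not the talk's design.
import torch
import torch.nn as nn
import pennylane as qml

N_QUBITS, N_LAYERS = 4, 2  # assumed sizes for illustration
dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev, interface="torch")
def vqc(inputs, weights):
    # Encode classical features as rotation angles, apply a trainable
    # entangling ansatz, then read out Pauli-Z expectation values.
    qml.AngleEmbedding(inputs, wires=range(N_QUBITS))
    qml.BasicEntanglerLayers(weights, wires=range(N_QUBITS))
    return [qml.expval(qml.PauliZ(w)) for w in range(N_QUBITS)]

class QLSTMCell(nn.Module):
    """LSTM cell whose four gate transformations are hybrid blocks:
    linear compress -> variational quantum circuit -> linear expand."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        shapes = {"weights": (N_LAYERS, N_QUBITS)}
        def gate():
            return nn.Sequential(
                nn.Linear(input_size + hidden_size, N_QUBITS),
                qml.qnn.TorchLayer(vqc, shapes),
                nn.Linear(N_QUBITS, hidden_size),
            )
        self.forget, self.input_, self.update, self.output = (
            gate(), gate(), gate(), gate())

    def forward(self, x, state):
        h, c = state
        v = torch.cat([x, h], dim=-1)       # concatenate [x_t; h_{t-1}]
        f = torch.sigmoid(self.forget(v))   # forget gate
        i = torch.sigmoid(self.input_(v))   # input gate
        g = torch.tanh(self.update(v))      # candidate cell state
        o = torch.sigmoid(self.output(v))   # output gate
        c = f * c + i * g                   # standard LSTM cell update
        h = o * torch.tanh(c)
        return h, c
```

Unrolling this cell over a sequence and training with a standard optimizer reproduces the hybrid setup the abstract describes; on NISQ hardware, the `default.qubit` simulator would be swapped for a device backend.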
Presenters
- Samuel Yen-Chi Chen (Brookhaven National Laboratory)

Authors
- Samuel Yen-Chi Chen (Brookhaven National Laboratory)
- Shinjae Yoo (Brookhaven National Laboratory)
- Yao-Lung L. Fang (Brookhaven National Laboratory)