Continuous Variable Quantum Boltzmann Machines
ORAL
Abstract
Boltzmann Machines (BMs) are machine learning models that offer a powerful framework for modelling probability distributions. In a BM, the probability distribution of the data is approximated from a finite set of samples. After successful training, the learned distribution resembles the actual distribution of the data closely enough that the model can make correct predictions about unseen instances. However, the model's ability to generalize suffers as the number of parameters grows, and training a classical BM can become impractical. For these reasons, quantum Boltzmann machines (QBMs) have been proposed. The QBM models developed so far utilize the discrete-variable quantum computing model (based on qubits), a framework that is only partially suited to continuous-valued data. It is more natural to extend the QBM model to the continuous-variable quantum computing (CVQC) model in order to study continuous data. In this study, we propose a CV QBM that utilizes a previously developed CV quantum imaginary-time evolution algorithm. We also discuss how to apply this model to classification and synthetic data generation problems.
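As a rough numerical illustration of the idea underlying the abstract, the sketch below prepares a Gibbs (Boltzmann) state by imaginary-time evolution on a single truncated bosonic mode and reads off the resulting Boltzmann probabilities. This is a classical simulation with hypothetical parameters (cutoff, omega, beta), not the authors' CV quantum imaginary-time evolution algorithm; it only shows the kind of thermal distribution a QBM is meant to sample from.

```python
import numpy as np
from scipy.linalg import expm

# Hypothetical toy parameters: one bosonic mode truncated to `cutoff`
# Fock levels, with Hamiltonian H = omega * n (number operator).
cutoff = 8
omega = 1.0
beta = 0.7                               # inverse temperature of the Gibbs state

n = np.diag(np.arange(cutoff))           # number operator in the Fock basis
H = omega * n                            # toy Hamiltonian

# Imaginary-time evolution: e^{-beta H}, normalized by the partition
# function Z, gives the Gibbs state rho = e^{-beta H} / Z.
rho_unnorm = expm(-beta * H)
Z = np.trace(rho_unnorm)
rho = rho_unnorm / Z

# The diagonal of rho is the Boltzmann distribution over Fock states,
# i.e. the distribution a (quantum) Boltzmann machine samples from.
boltzmann_probs = np.real(np.diag(rho))
print(boltzmann_probs)
```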
Presenters
-
Kubra Yeter-Aydeniz
MITRE Corporation
Authors
-
Kubra Yeter-Aydeniz
MITRE Corporation
-
George Siopsis
University of Tennessee