Fast Quantum-Assisted Sampling as a Potential Source of Quantum Advantage at Training Generative Models with Latent Variables

ORAL

Abstract

Approximating the underlying structure of real-world data is a central quest in unsupervised machine learning, where generative models with latent variables have proven to be a powerful tool. Restricted Boltzmann machines (RBMs) and variational autoencoders (VAEs) are important graphical models capable of learning multi-modal distributions over high-dimensional datasets. However, their log-likelihood gradients must be approximated via sampling, which generally requires computationally expensive MCMC chains. Given this challenging task of approximating a thermal state, quantum annealers are promising candidates to sample classical or quantum Gibbs distributions, replacing slow classical MCMC schemes. In particular, we introduce a non-conventional annealing protocol, so-called Random Frequency Quantum Annealing (RFQA) [1], a promising candidate to offer noise-tolerant speed-ups for optimization and sampling tasks. This work explores the performance of VAEs [2] and RBMs trained with state-of-the-art (and computationally expensive) classical sampling algorithms, such as persistent contrastive divergence (PCD), gradient-centering methods [3], parallel tempering (PT), and contrastive divergence (CD-k) with several values of k, as proxies for the quantum device. Results on image reconstruction assessed on the MNIST dataset show that gradients estimated with samples close to the equilibrium Gibbs distribution generalize better, yielding less biased and higher log-likelihood scores even when facing data scarcity, i.e., in numerical experiments run on reduced training sets containing only 4% of the original data. Finally, it is shown that deep convolutional VAEs with Boltzmann machine (BM) priors placed in the latent space achieve higher log-likelihood scores than the commonly used Gaussian priors.
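The sampling-based gradient estimation described above can be illustrated with a minimal CD-k sketch for a binary RBM. This is not the abstract's implementation; it assumes NumPy only, and all names (`cd_k_gradients`, `W`, `b`, `c`) are illustrative. The positive phase uses hidden probabilities clamped to the data; the negative phase runs k steps of block Gibbs sampling, which PCD or PT would replace with longer-lived or tempered chains.

```python
# Hypothetical sketch of CD-k gradient estimation for a binary RBM.
# All function and variable names are illustrative, not from the talk.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_bernoulli(p):
    # Sample binary units from their activation probabilities
    return (rng.random(p.shape) < p).astype(np.float64)

def cd_k_gradients(v_data, W, b, c, k=1):
    """Estimate RBM log-likelihood gradients via CD-k.

    v_data : (batch, n_visible) binary data batch
    W      : (n_visible, n_hidden) weight matrix
    b, c   : visible / hidden bias vectors
    """
    # Positive phase: hidden probabilities given the data
    ph_data = sigmoid(v_data @ W + c)

    # Negative phase: k steps of block Gibbs sampling starting from the data
    v = v_data
    for _ in range(k):
        h = sample_bernoulli(sigmoid(v @ W + c))
        v = sample_bernoulli(sigmoid(h @ W.T + b))
    ph_model = sigmoid(v @ W + c)

    # Gradient = data-phase statistics minus model-phase statistics
    n = v_data.shape[0]
    dW = (v_data.T @ ph_data - v.T @ ph_model) / n
    db = (v_data - v).mean(axis=0)
    dc = (ph_data - ph_model).mean(axis=0)
    return dW, db, dc

# Tiny usage example with a random binary batch
n_vis, n_hid = 6, 4
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b, c = np.zeros(n_vis), np.zeros(n_hid)
v_batch = (rng.random((8, n_vis)) < 0.5).astype(np.float64)
dW, db, dc = cd_k_gradients(v_batch, W, b, c, k=5)
```

Small k keeps the chain near the data and biases the gradient; the abstract's point is that samplers closer to the equilibrium Gibbs distribution (large k, PCD, PT, or a quantum annealer) reduce this bias.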

[1] Kapit, Eliot, and Vadim Oganesyan. "Noise-tolerant quantum speedups in quantum annealing without fine tuning." Quantum Science and Technology 6.2 (2021): 025013.

[2] Winci, Walter, et al. "A path towards quantum advantage in training deep generative models with quantum annealers." Machine Learning: Science and Technology 1.4 (2020): 045028.

[3] Melchior, Jan, Asja Fischer, and Laurenz Wiskott. "How to center deep Boltzmann machines." The Journal of Machine Learning Research 17.1 (2016): 3387-3447.

Presenters

  • Carla M Quispe Flores

    Colorado School of Mines

Authors

  • Carla M Quispe Flores

    Colorado School of Mines