
Techniques for design and training of large quantum machine learning models

ORAL

Abstract

Despite the early promise shown by the success of training small variational quantum circuits for machine learning tasks on classical data, scaling these methods remains a challenge. Two critical open questions are (a) how to load classical data in a way that scales efficiently in quantum resources while retaining high expressivity, and (b) how to train models with a large number of parameters when backpropagation, the technique commonly used in classical neural networks to calculate gradients, is not applicable. Here, we address both of these issues. We first show empirically that ultra-sparse encoding, in which just a few bits are extracted from each sample of the dataset for loading onto the quantum computer, can achieve highly accurate training. Second, we discuss a technique that exploits the linearity of quantum models to train them without a classical optimizer, thereby avoiding gradient calculation altogether. We also discuss how to use physical principles to set hyperparameters such as the batch size and the number of shots. Finally, we introduce a software platform that provides the necessary abstractions to implement these techniques.
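The abstract does not specify the encoding scheme or the solver, so the following is only an illustrative sketch of the two ideas it names. The bit-selection rule (highest-variance features, binarized by sign) is a hypothetical stand-in for the paper's ultra-sparse encoding, and the one-hot basis-state indicators stand in for measured expectation values of a quantum feature map; the key point illustrated is that a model linear in such features can be fit in closed form, with no gradient-based optimizer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 200 samples, 64 features; the label depends on only
# two features, which are given larger variance so that a variance
# criterion will pick them out.
X = rng.normal(size=(200, 64))
X[:, :2] *= 3
y = ((X[:, 0] > 0) & (X[:, 1] > 0)).astype(float)

# Ultra-sparse encoding (illustrative): keep just k bits per sample.
# A real implementation would load these k bits onto the quantum
# computer, e.g. as a computational-basis state.
k = 4
top = np.argsort(X.var(axis=0))[-k:]        # k highest-variance features
bits = (X[:, top] > 0).astype(int)          # k bits per sample

# Classical stand-in for the quantum model: a quantum model is linear
# in the expectation values of a set of observables; here we mimic
# those with one-hot indicators of the 2^k possible basis states.
idx = bits @ (1 << np.arange(k))            # basis-state index per sample
Phi = np.eye(1 << k)[idx]                   # (200, 2^k) feature matrix

# Gradient-free training: because the model is linear in the features,
# the weights follow from a single least-squares solve.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
pred = (Phi @ w > 0.5).astype(float)
print("train accuracy:", (pred == y).mean())
```

Since the label is a function of two of the selected bits, the closed-form solve recovers it exactly here; with noisier data the same solve simply returns the best linear fit over the measured features.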

Presenters

  • Sonika Johri

    Coherent Computing Inc

Authors

  • Sonika Johri

    Coherent Computing Inc