TY - JOUR
T1 - Generalization in quantum machine learning from few training data
Y1 - 2021
A1 - Matthias C. Caro
A1 - Hsin-Yuan Huang
A1 - M. Cerezo
A1 - Kunal Sharma
A1 - Andrew Sornborger
A1 - Lukasz Cincio
A1 - Patrick J. Coles
AB - Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set, and subsequently making predictions on a testing data set (i.e., generalizing). In this work, we provide a comprehensive study of generalization performance in QML after training on a limited number N of training data points. We show that the generalization error of a quantum machine learning model with T trainable gates scales at worst as √(T/N). When only K ≪ T gates have undergone substantial change in the optimization process, we prove that the generalization error improves to √(K/N). Our results imply that the compiling of unitaries into a polynomial number of native gates, a crucial application for the quantum computing industry that typically uses exponential-size training data, can be sped up significantly. We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set. Other potential applications include learning quantum error correcting codes or quantum dynamical simulation. Our work injects new hope into the field of QML, as good generalization is guaranteed from few training data.
UR - https://arxiv.org/abs/2111.05292
ER -

TY - JOUR
T1 - Noise-induced barren plateaus in variational quantum algorithms
JF - Nature Communications
Y1 - 2021
A1 - Samson Wang
A1 - Enrico Fontana
A1 - M. Cerezo
A1 - Kunal Sharma
A1 - Akira Sone
A1 - Lukasz Cincio
A1 - Patrick J. Coles
AB - Variational Quantum Algorithms (VQAs) may be a path to quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) computers. A natural question is whether noise on NISQ devices places fundamental limitations on VQA performance. We rigorously prove a serious limitation for noisy VQAs, in that the noise causes the training landscape to have a barren plateau (i.e., vanishing gradient). Specifically, for the local Pauli noise considered, we prove that the gradient vanishes exponentially in the number of qubits n if the depth of the ansatz grows linearly with n. These noise-induced barren plateaus (NIBPs) are conceptually different from noise-free barren plateaus, which are linked to random parameter initialization. Our result is formulated for a generic ansatz that includes as special cases the Quantum Alternating Operator Ansatz and the Unitary Coupled Cluster Ansatz, among others. For the former, our numerical heuristics demonstrate the NIBP phenomenon for a realistic hardware noise model.
VL - 12
U4 - 6961
U5 - https://doi.org/10.1038/s41467-021-27045-6
ER -