Title: Generalization in quantum machine learning from few training data
Year: 2021
Date: 11/9/2021
Abstract:
Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set, and subsequently making predictions on a testing data set (i.e., generalizing). In this work, we provide a comprehensive study of generalization performance in QML after training on a limited number N of training data points. We show that the generalization error of a quantum machine learning model with T trainable gates scales at worst as √(T/N). When only K ≪ T gates have undergone substantial change in the optimization process, we prove that the generalization error improves to √(K/N). Our results imply that the compiling of unitaries into a polynomial number of native gates, a crucial application for the quantum computing industry that typically uses exponential-size training data, can be sped up significantly. We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set. Other potential applications include learning quantum error correcting codes or quantum dynamical simulation. Our work injects new hope into the field of QML, as good generalization is guaranteed from few training data.
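The two scaling results quoted in the abstract can be written compactly as follows (a sketch in notation assumed here, not taken from the record: gen(N) denotes the generalization error, T the number of trainable gates, K the number of gates that change substantially during optimization, and N the training-set size):

```latex
% Worst-case bound for a QML model with T trainable gates
% trained on N data points (notation assumed):
\mathrm{gen}(N) \in O\!\left(\sqrt{T/N}\right)
% If only K \ll T gates undergo substantial change during
% optimization, the bound tightens to:
\mathrm{gen}(N) \in O\!\left(\sqrt{K/N}\right)
```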
Authors: Caro, Matthias C.; Huang, Hsin-Yuan; Cerezo, M.; Sharma, Kunal; Sornborger, Andrew; Cincio, Lukasz; Coles, Patrick J.
URL: https://arxiv.org/abs/2111.05292

Title: Noise-induced barren plateaus in variational quantum algorithms
Year: 2021
Date: 11/29/2021
Volume: 12
Article: 6961
Abstract:
Variational Quantum Algorithms (VQAs) may be a path to quantum advantage on Noisy Intermediate-Scale Quantum (NISQ) computers. A natural question is whether noise on NISQ devices places fundamental limitations on VQA performance. We rigorously prove a serious limitation for noisy VQAs, in that the noise causes the training landscape to have a barren plateau (i.e., vanishing gradient). Specifically, for the local Pauli noise considered, we prove that the gradient vanishes exponentially in the number of qubits n if the depth of the ansatz grows linearly with n. These noise-induced barren plateaus (NIBPs) are conceptually different from noise-free barren plateaus, which are linked to random parameter initialization. Our result is formulated for a generic ansatz that includes as special cases the Quantum Alternating Operator Ansatz and the Unitary Coupled Cluster Ansatz, among others. For the former, our numerical heuristics demonstrate the NIBP phenomenon for a realistic hardware noise model.
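The exponential gradient suppression stated in the NIBP abstract can be sketched as a formula (notation assumed here, not taken from the record: C is the cost function, θ a trainable parameter, n the number of qubits, and L the ansatz depth):

```latex
% For local Pauli noise with ansatz depth L growing linearly in n,
% the abstract states the gradient vanishes exponentially in n;
% schematically, for some constant 0 < b < 1 (assumed notation):
\left|\partial_{\theta}\,C(\boldsymbol{\theta})\right|
  \in O\!\left(b^{\,n}\right)
  \quad \text{when } L \in \Omega(n)
```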
Authors: Wang, Samson; Fontana, Enrico; Cerezo, M.; Sharma, Kunal; Sone, Akira; Cincio, Lukasz; Coles, Patrick J.
URL: https://www.quics.umd.edu/publications/noise-induced-barren-plateaus-variational-quantum-algorithms