Abbas Amira, Sutter David, Zoufal Christa, Lucchi Aurelien, Figalli Alessio, Woerner Stefan
IBM Quantum, IBM Research - Zurich, Rüschlikon, Switzerland.
University of KwaZulu-Natal, Durban, South Africa.
Nat Comput Sci. 2021 Jun;1(6):403-409. doi: 10.1038/s43588-021-00084-1. Epub 2021 Jun 24.
It is unknown whether near-term quantum computers are advantageous for machine learning tasks. In this work we address this question by trying to understand how powerful and trainable quantum machine learning models are in relation to popular classical neural networks. We propose the effective dimension, a measure that captures these qualities, and prove that it can be used to assess any statistical model's ability to generalize on new data. Crucially, the effective dimension is a data-dependent measure that depends on the Fisher information, which allows us to gauge the ability of a model to train. We demonstrate numerically that a class of quantum neural networks is able to achieve a considerably better effective dimension than comparable feedforward networks and train faster, suggesting an advantage for quantum machine learning, which we verify on real quantum hardware.
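The abstract's central quantity, the effective dimension, is built from Fisher information matrices averaged over the model's parameter space. Below is a minimal numerical sketch of that idea: a Monte Carlo estimate over pre-computed, normalized Fisher matrices. The function name `effective_dimension`, the toy random-matrix "Fisher" samples, and the normalization convention (average trace equal to the parameter count `d`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def effective_dimension(fishers, n, gamma=1.0):
    """Monte Carlo sketch of an effective-dimension-style quantity.

    fishers: array of shape (k, d, d) -- normalized Fisher information
             matrices sampled over the parameter space (assumption:
             pre-computed and normalized so the average trace is d).
    n:       number of data samples.
    gamma:   constant in (0, 1].
    """
    k, d, _ = fishers.shape
    c = gamma * n / (2 * np.pi * np.log(n))
    # log det(I + c * F) via slogdet for numerical stability
    _, logdets = np.linalg.slogdet(np.eye(d) + c * fishers)
    m = 0.5 * logdets
    # log of the Monte Carlo average of sqrt(det(...)), computed stably
    log_avg = np.log(np.mean(np.exp(m - m.max()))) + m.max()
    return 2 * log_avg / np.log(c)

# Toy example: random positive semidefinite matrices standing in for
# Fisher information, normalized so the average trace equals d = 4.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 4, 4))
F = A @ np.transpose(A, (0, 2, 1))
F *= 4 * len(F) / np.trace(F, axis1=1, axis2=2).sum()

ed = effective_dimension(F, n=1000)
print(ed)  # a scalar, roughly between 0 and d for large n
```

Because the estimate works entirely with log-determinants, it stays stable even when individual Fisher matrices are nearly singular, which is the regime the abstract associates with hard-to-train classical models.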