

Out-of-distribution generalization for learning quantum dynamics.

Affiliations

Department of Mathematics, Technical University of Munich, Garching, Germany.

Munich Center for Quantum Science and Technology (MCQST), Munich, Germany.

Publication information

Nat Commun. 2023 Jul 5;14(1):3751. doi: 10.1038/s41467-023-39381-w.

Abstract

Generalization bounds are a critical tool to assess the training data requirements of Quantum Machine Learning (QML). Recent work has established guarantees for in-distribution generalization of quantum neural networks (QNNs), where training and testing data are drawn from the same data distribution. However, there are currently no results on out-of-distribution generalization in QML, where we require a trained model to perform well even on data drawn from a different distribution to the training distribution. Here, we prove out-of-distribution generalization for the task of learning an unknown unitary. In particular, we show that one can learn the action of a unitary on entangled states having trained only on product states. Since product states can be prepared using only single-qubit gates, this advances the prospects of learning quantum dynamics on near-term quantum hardware, and further opens up new methods for both the classical and quantum compilation of quantum circuits.
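To make the setup concrete, the following is a minimal numerical sketch of the learning task described in the abstract; it is a purely classical simulation, not the authors' code or a hardware experiment. A parameterized two-qubit unitary V(theta) = exp(-iH(theta)) is fit to an unknown target unitary U using only product-state training data, and its agreement with U is then checked on entangled test states. The helper names (make_hermitian, model_unitary), the training-set size, and the choice of BFGS as the optimizer are illustrative assumptions, not details taken from the paper.

```python
# Sketch: learn an unknown two-qubit unitary from product states only,
# then test the fit on entangled states (the out-of-distribution setting).
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
DIM = 4  # Hilbert-space dimension for two qubits

def random_state(dim, rng):
    """Random normalized pure state of the given dimension."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

def random_product_state(rng):
    """Tensor product of two random single-qubit states (preparable with single-qubit gates)."""
    return np.kron(random_state(2, rng), random_state(2, rng))

def make_hermitian(theta):
    """Pack a real parameter vector (length DIM*DIM) into a DIM x DIM Hermitian generator."""
    h = np.zeros((DIM, DIM), dtype=complex)
    idx = 0
    for i in range(DIM):
        h[i, i] = theta[idx]; idx += 1
    for i in range(DIM):
        for j in range(i + 1, DIM):
            h[i, j] = theta[idx] + 1j * theta[idx + 1]
            h[j, i] = np.conj(h[i, j])
            idx += 2
    return h

def model_unitary(theta):
    """Parameterized unitary V(theta) = exp(-i H(theta))."""
    return expm(-1j * make_hermitian(theta))

# Unknown target unitary we want to learn.
U = expm(-1j * make_hermitian(rng.normal(size=DIM * DIM)))

# Training data: product states only.
train_states = [random_product_state(rng) for _ in range(20)]

def loss(theta):
    """Average infidelity between U|psi> and V(theta)|psi> over the training set."""
    V = model_unitary(theta)
    return np.mean([1.0 - abs(np.vdot(V @ s, U @ s)) ** 2 for s in train_states])

res = minimize(loss, rng.normal(size=DIM * DIM), method="BFGS")
V = model_unitary(res.x)

# Out-of-distribution test: a Bell state plus random (generically entangled) two-qubit states.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
test_states = [bell] + [random_state(DIM, rng) for _ in range(10)]
test_infid = np.mean([1.0 - abs(np.vdot(V @ s, U @ s)) ** 2 for s in test_states])
print(f"train infidelity: {loss(res.x):.2e}, entangled-test infidelity: {test_infid:.2e}")
```

In the spirit of the paper's result, a low infidelity on the product-state training set should generically carry over to the entangled test states, up to the usual caveats of local minima in the classical optimization.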


Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d81c/10322910/f6d61d8d337b/41467_2023_39381_Fig1_HTML.jpg
