
Training deep quantum neural networks.

Affiliations

Institut für Theoretische Physik, Leibniz Universität Hannover, Appelstraße 2, 30167, Hannover, Germany.

ARC Centre for Engineered Quantum Systems, School of Mathematics and Physics, University of Queensland, Brisbane, QLD, 4072, Australia.

Publication information

Nat Commun. 2020 Feb 10;11(1):808. doi: 10.1038/s41467-020-14454-2.

Abstract

Neural networks enjoy widespread success in both research and industry and, with the advent of quantum technology, it is a crucial challenge to design quantum neural networks for fully quantum learning tasks. Here we propose a truly quantum analogue of classical neurons, which form quantum feedforward neural networks capable of universal quantum computation. We describe the efficient training of these networks using the fidelity as a cost function, providing both classical and efficient quantum implementations. Our method allows for fast optimisation with reduced memory requirements: the number of qudits required scales with only the width, allowing deep-network optimisation. We benchmark our proposal for the quantum task of learning an unknown unitary and find remarkable generalisation behaviour and a striking robustness to noisy training data.
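The abstract describes training feedforward quantum networks by maximising the fidelity between the network's output states and the desired target states. As a rough, purely illustrative sketch of fidelity-based training (not the authors' layered neuron construction or their efficient quantum training algorithm), the Python snippet below fits a single parametrised unitary to training pairs generated by an unknown target unitary; the exponential-of-Hermitian parametrisation, the classical optimiser, and all variable names are assumptions introduced here.

```python
# Illustrative only: maximise the average output fidelity of a parametrised
# unitary over training pairs (|phi_in>, V|phi_in>) produced by an unknown
# two-qubit target unitary V. Parametrisation and optimiser are assumptions.
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

rng = np.random.default_rng(0)
dim = 4  # two qubits

def random_state(d):
    """Random pure state of dimension d."""
    v = rng.normal(size=d) + 1j * rng.normal(size=d)
    return v / np.linalg.norm(v)

def random_unitary(d):
    """Haar-random unitary via QR decomposition of a Ginibre matrix."""
    z = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))

# Unknown target unitary and the training pairs it generates.
V = random_unitary(dim)
train_in = [random_state(dim) for _ in range(10)]
train_out = [V @ psi for psi in train_in]

def parametrised_unitary(params):
    """Candidate unitary U = exp(iH), with H a dense Hermitian matrix built
    from 2*dim^2 real parameters (an illustrative parametrisation)."""
    a = params[:dim * dim].reshape(dim, dim)
    b = params[dim * dim:].reshape(dim, dim)
    h = a + 1j * b
    H = (h + h.conj().T) / 2
    return expm(1j * H)

def cost(params):
    """Negative average fidelity |<phi_out|U|phi_in>|^2 over the training set."""
    U = parametrised_unitary(params)
    fids = [abs(np.vdot(out, U @ inp)) ** 2 for inp, out in zip(train_in, train_out)]
    return -float(np.mean(fids))

x0 = rng.normal(scale=0.1, size=2 * dim * dim)
res = minimize(cost, x0, method="BFGS")  # finite-difference gradients
print(f"average training fidelity after optimisation: {-res.fun:.4f}")
```

In the paper's setting the network is composed layer by layer from neuron unitaries and trained with memory requirements that scale only with the network width; the sketch above mirrors only the fidelity cost at the level of one global unitary.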

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9175/7010779/d5c976e44c1d/41467_2020_14454_Fig1_HTML.jpg
