Expressibility-induced Concentration of Quantum Neural Tangent Kernels.

Authors

Yu Li-Wei, Li Weikang, Ye Qi, Lu Zhide, Han Zizhao, Deng Dong-Ling

Affiliations

Nankai University, Chern Institute of Mathematics, Tianjin, 300071, CHINA.

Tsinghua University, Center for Quantum Information, IIIS, Beijing, 100084, CHINA.

Publication Information

Rep Prog Phys. 2024 Oct 3. doi: 10.1088/1361-6633/ad82cf.

DOI: 10.1088/1361-6633/ad82cf
PMID: 39360390
Abstract

Quantum tangent kernel methods provide an efficient approach to analyzing the performance of quantum machine learning models in the infinite-width limit, which is of crucial importance in designing appropriate circuit architectures for certain learning tasks. Recently, they have been adapted to describe the convergence rate of training errors in quantum neural networks in an analytical manner. Here, we study the connections between the expressibility and value concentration of quantum tangent kernel models. In particular, for global loss functions, we rigorously prove that high expressibility of both the global and local quantum encodings can lead to exponential concentration of quantum tangent kernel values to zero. Whereas for local loss functions, such issue of exponential concentration persists owing to the high expressibility, but can be partially mitigated. We further carry out extensive numerical simulations to support our analytical theories. Our discoveries unveil a fundamental feature of quantum neural tangent kernels, indicating that the issue of their concentration cannot be bypassed merely by transitioning to a local encoding scheme while maintaining high expressibility. This offers valuable insights for the design of wide quantum variational circuit models in practical applications.
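For context, here is a minimal sketch of the quantity the abstract discusses, using the standard definition of the quantum neural tangent kernel; the notation (a parametrized circuit output f_\theta(x), n qubits, kernel K) is ours and is not taken from the paper.

K(x, x') = \sum_{j} \frac{\partial f_\theta(x)}{\partial \theta_j} \, \frac{\partial f_\theta(x')}{\partial \theta_j}

Here f_\theta(x) is the measured output of a parametrized quantum circuit on input x, and the sum runs over all trainable parameters. "Exponential concentration to zero" then means that when the circuit is highly expressible (its parameter distribution approximates a 2-design), \mathbb{E}_\theta[K(x, x')] \approx 0 and \mathrm{Var}_\theta[K(x, x')] \in O(2^{-cn}) for some constant c > 0 on n qubits, so kernel values for different input pairs become exponentially hard to distinguish from zero, or from one another, without exponentially many measurement shots.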


Similar Articles

1. Expressibility-induced Concentration of Quantum Neural Tangent Kernels.
Rep Prog Phys. 2024 Oct 3. doi: 10.1088/1361-6633/ad82cf.
2. Exponential concentration in quantum kernel methods.
Nat Commun. 2024 Jun 18;15(1):5200. doi: 10.1038/s41467-024-49287-w.
3. Simple, fast, and flexible framework for matrix completion with infinite width neural networks.
Proc Natl Acad Sci U S A. 2022 Apr 19;119(16):e2115064119. doi: 10.1073/pnas.2115064119. Epub 2022 Apr 11.
4. Practical application of quantum neural network to materials informatics.
Sci Rep. 2024 Apr 13;14(1):8583. doi: 10.1038/s41598-024-59276-0.
5. Analytic Theory for the Dynamics of Wide Quantum Neural Networks.
Phys Rev Lett. 2023 Apr 14;130(15):150601. doi: 10.1103/PhysRevLett.130.150601.
6. Quantum Neural Network Inspired Hardware Adaptable Ansatz for Efficient Quantum Simulation of Chemical Systems.
J Chem Theory Comput. 2023 Dec 12;19(23):8587-8597. doi: 10.1021/acs.jctc.3c00527. Epub 2023 Dec 4.
7. Universal expressiveness of variational quantum classifiers and quantum kernels for support vector machines.
Nat Commun. 2023 Feb 2;14(1):576. doi: 10.1038/s41467-023-36144-5.
8. Quantum Physics-Informed Neural Networks.
Entropy (Basel). 2024 Jul 30;26(8):649. doi: 10.3390/e26080649.
9. Presence and Absence of Barren Plateaus in Tensor-Network Based Machine Learning.
Phys Rev Lett. 2022 Dec 30;129(27):270501. doi: 10.1103/PhysRevLett.129.270501.
10. Spectral bias and task-model alignment explain generalization in kernel regression and infinitely wide neural networks.
Nat Commun. 2021 May 18;12(1):2914. doi: 10.1038/s41467-021-23103-1.

Cited By

1. Quantum machine learning for Lyapunov-stabilized computation offloading in next-generation MEC networks.
Sci Rep. 2025 Jan 2;15(1):405. doi: 10.1038/s41598-024-84441-w.
2. Dynamical transition in controllable quantum neural networks with large depth.
Nat Commun. 2024 Oct 29;15(1):9354. doi: 10.1038/s41467-024-53769-2.