

Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout.

Authors

Chen Hao, Barthel Thomas

Publication

IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):7825-7832. doi: 10.1109/TPAMI.2024.3396386. Epub 2024 Nov 6.

DOI: 10.1109/TPAMI.2024.3396386
PMID: 38696289
Abstract

Tensor networks developed in the context of condensed matter physics try to approximate order-N tensors with a reduced number of degrees of freedom that is only polynomial in N and arranged as a network of partially contracted smaller tensors. As we have recently demonstrated in the context of quantum many-body physics, computation costs can be further substantially reduced by imposing constraints on the canonical polyadic (CP) rank of the tensors in such networks. Here, we demonstrate how tree tensor networks (TTN) with CP rank constraints and tensor dropout can be used in machine learning. The approach is found to outperform other tensor-network-based methods in Fashion-MNIST image classification. A low-rank TTN classifier with branching ratio b=4 reaches a test set accuracy of 90.3% with low computation costs. Consisting of mostly linear elements, tensor network classifiers avoid the vanishing gradient problem of deep neural networks. The CP rank constraints have additional advantages: The number of parameters can be decreased and tuned more freely to control overfitting, improve generalization properties, and reduce computation costs. They allow us to employ trees with large branching ratios, substantially improving the representation power.
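The core operation the abstract describes can be illustrated with a minimal NumPy sketch (hypothetical shapes and function names, not the authors' code): a TTN node whose order-(b+1) tensor is stored in CP format as factor matrices, contracted with its child vectors at a cost linear in the CP rank, with optional tensor dropout applied to the shared rank components during training.

```python
import numpy as np

def cp_node_contract(child_vecs, factors, out_factor, dropout_p=0.0, rng=None):
    """Contract one CP-rank-constrained TTN node with its b child vectors.

    The order-(b+1) node tensor T[i1..ib, o] = sum_r A_1[i1,r]...A_b[ib,r] B[o,r]
    is never formed explicitly; the contraction works on the factor matrices,
    so the cost scales linearly in the CP rank instead of exponentially in b.
    """
    # elementwise product over the shared CP rank index r
    z = np.ones(out_factor.shape[1])
    for x, A in zip(child_vecs, factors):
        z *= A.T @ x
    if dropout_p > 0.0 and rng is not None:
        # tensor dropout: randomly discard CP rank components while training
        mask = rng.random(z.shape) >= dropout_p
        z = z * mask / (1.0 - dropout_p)
    return out_factor @ z  # parent vector passed up the tree

# tiny example node: b = 2 children, local dim d = 3, CP rank r = 4
rng = np.random.default_rng(0)
d, r, b = 3, 4, 2
factors = [rng.standard_normal((d, r)) for _ in range(b)]   # A_1, A_2
out_factor = rng.standard_normal((d, r))                    # B
children = [rng.standard_normal(d) for _ in range(b)]
parent = cp_node_contract(children, factors, out_factor)
```

A classifier in the paper's spirit would stack such nodes into a tree over the input pixels' local feature vectors, with the root node's output dimension equal to the number of classes; the CP rank r is the tunable knob that controls parameter count and overfitting.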


Similar articles

1. Machine Learning With Tree Tensor Networks, CP Rank Constraints, and Tensor Dropout.
IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):7825-7832. doi: 10.1109/TPAMI.2024.3396386. Epub 2024 Nov 6.
2. Efficient Construction of Canonical Polyadic Approximations of Tensor Networks.
J Chem Theory Comput. 2023 Jan 10;19(1):71-81. doi: 10.1021/acs.jctc.2c00861. Epub 2022 Dec 9.
3. Block-term tensor neural networks.
Neural Netw. 2020 Oct;130:11-21. doi: 10.1016/j.neunet.2020.05.034. Epub 2020 Jun 7.
4. Stable tensor neural networks for efficient deep learning.
Front Big Data. 2024 May 30;7:1363978. doi: 10.3389/fdata.2024.1363978. eCollection 2024.
5. Tensor Networks for Latent Variable Analysis: Higher Order Canonical Polyadic Decomposition.
IEEE Trans Neural Netw Learn Syst. 2020 Jun;31(6):2174-2188. doi: 10.1109/TNNLS.2019.2929063. Epub 2019 Aug 26.
6. Spectral Super-Resolution via Deep Low-Rank Tensor Representation.
IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):5140-5150. doi: 10.1109/TNNLS.2024.3359852. Epub 2025 Feb 28.
7. Bayesian CP Factorization of Incomplete Tensors with Automatic Rank Determination.
IEEE Trans Pattern Anal Mach Intell. 2015 Sep;37(9):1751-63. doi: 10.1109/TPAMI.2015.2392756.
8. Tensor networks for unsupervised machine learning.
Phys Rev E. 2023 Jan;107(1):L012103. doi: 10.1103/PhysRevE.107.L012103.
9. Presence and Absence of Barren Plateaus in Tensor-Network Based Machine Learning.
Phys Rev Lett. 2022 Dec 30;129(27):270501. doi: 10.1103/PhysRevLett.129.270501.
10. Robust Approximation of Tensor Networks: Application to Grid-Free Tensor Factorization of the Coulomb Interaction.
J Chem Theory Comput. 2021 Apr 13;17(4):2217-2230. doi: 10.1021/acs.jctc.0c01310. Epub 2021 Mar 29.