
Mutual Information Scaling for Tensor Network Machine Learning.

Authors

Convy Ian, Huggins William, Liao Haoran, Whaley K Birgitta

Affiliations

Department of Chemistry, University of California, Berkeley, CA 94720, USA.

Berkeley Quantum Information and Computation Center, University of California, Berkeley, CA 94720, USA.

Publication

Mach Learn Sci Technol. 2022 Mar;3(1). doi: 10.1088/2632-2153/ac44a9. Epub 2022 Jan 20.

Abstract

Tensor networks have emerged as promising tools for machine learning, inspired by their widespread use as variational ansätze in quantum many-body physics. It is well known that the success of a given tensor network ansatz depends in part on how well it can reproduce the underlying entanglement structure of the target state, with different network designs favoring different scaling patterns. We demonstrate here how a related correlation analysis can be applied to tensor network machine learning, and explore whether classical data possess correlation scaling patterns similar to those found in quantum states, which might indicate the best network to use for a given dataset. We utilize mutual information as a measure of correlations in classical data, and show that it can serve as a lower bound on the entanglement needed for a probabilistic tensor network classifier. We then develop a logistic regression algorithm to estimate the mutual information between bipartitions of data features, and verify its accuracy on a set of Gaussian distributions designed to mimic different correlation patterns. Using this algorithm, we characterize the scaling patterns in the MNIST and Tiny Images datasets, and find clear evidence of boundary-law scaling in the latter. This quantum-inspired classical analysis offers insight into the design of tensor networks which are best suited for specific learning tasks.
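The abstract describes a logistic regression algorithm for estimating the mutual information between bipartitions of data features, validated on Gaussian distributions where the true value is known analytically. The following is a minimal illustrative sketch of the general classifier-based idea, not a reproduction of the paper's exact algorithm: a logistic regression is trained to distinguish joint samples from samples whose dependence has been broken by shuffling, and the average log-odds over joint samples then estimates I(X;Y). For a bivariate Gaussian with correlation ρ, the analytic value is -½ ln(1 - ρ²). All variable names and modeling choices (e.g. the quadratic feature map) are assumptions for this sketch.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Bivariate Gaussian with correlation rho; analytic MI = -0.5 * ln(1 - rho^2)
rho = 0.8
n = 50_000
cov = np.array([[1.0, rho], [rho, 1.0]])
joint = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# "Product" samples: permuting one feature preserves the marginals
# but destroys the correlation between the two features
product = joint.copy()
product[:, 1] = rng.permutation(product[:, 1])

# Train a classifier to tell joint from product samples; for balanced
# classes its log-odds approximate log p(x, y) / [p(x) p(y)]
X = np.vstack([joint, product])
# Quadratic features let logistic regression represent the exact
# Gaussian log density ratio, which is quadratic in (x, y)
feats = np.column_stack([X, X[:, 0] * X[:, 1], X**2])
labels = np.concatenate([np.ones(n), np.zeros(n)])
clf = LogisticRegression(max_iter=1000).fit(feats, labels)

# MI estimate: mean log-odds evaluated on the joint samples
mi_est = clf.decision_function(feats[:n]).mean()
mi_true = -0.5 * np.log(1 - rho**2)
print(f"estimated MI = {mi_est:.3f}, analytic MI = {mi_true:.3f}")
```

The same recipe extends to the bipartition setting in the abstract: each half of a feature bipartition plays the role of x and y, and shuffling one half across samples yields the product distribution. The quality of the estimate then hinges on how well the chosen feature map can express the true log density ratio.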


Similar Articles

1. Multipartite Entanglement in Stabilizer Tensor Networks.
Phys Rev Lett. 2020 Dec 11;125(24):241602. doi: 10.1103/PhysRevLett.125.241602.
2. Tensor networks for unsupervised machine learning.
Phys Rev E. 2023 Jan;107(1):L012103. doi: 10.1103/PhysRevE.107.L012103.
3. Nonparametric tensor ring decomposition with scalable amortized inference.
Neural Netw. 2024 Jan;169:431-441. doi: 10.1016/j.neunet.2023.10.031. Epub 2023 Oct 27.


