Suppr 超能文献


Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines

Authors

Cheng Song, Chen Jing, Wang Lei

Affiliations

Institute of Physics, Chinese Academy of Sciences, Beijing 100190, China.

School of Physical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China.

Publication

Entropy (Basel). 2018 Aug 7;20(8):583. doi: 10.3390/e20080583.

DOI: 10.3390/e20080583
PMID: 33265672
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7513111/
Abstract

We compare and contrast the statistical physics and quantum physics inspired approaches for unsupervised generative modeling of classical data. The two approaches represent probabilities of observed data using energy-based models and quantum states, respectively. Classical and quantum information patterns of the target datasets therefore provide principled guidelines for structural design and learning in these two approaches. Taking the Restricted Boltzmann Machines (RBM) as an example, we analyze the information theoretical bounds of the two approaches. We also estimate the classical mutual information of the standard MNIST datasets and the quantum Rényi entropy of corresponding Matrix Product States (MPS) representations. Both information measures are much smaller compared to their theoretical upper bound and exhibit similar patterns, which imply a common inductive bias of low information complexity. By comparing the performance of RBM with various architectures on the standard MNIST datasets, we found that the RBM with local sparse connection exhibit high learning efficiency, which supports the application of tensor network states in machine learning problems.
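The two model families contrasted in the abstract assign probabilities in structurally different ways: an energy-based model such as the RBM sets p(v) ∝ e^{-F(v)} (the free energy after tracing out the binary hidden units), while a Born machine represents a wavefunction ψ(v), here as a Matrix Product State, and sets p(v) ∝ |ψ(v)|². The toy sketch below is not from the paper; the sizes, random parameters, and brute-force normalization are illustrative assumptions chosen so both distributions can be enumerated exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 4, 3

# --- Boltzmann-machine side: RBM with E(v, h) = -a.v - b.h - v.W.h ---
a = rng.normal(size=n_visible)          # visible biases
b = rng.normal(size=n_hidden)           # hidden biases
W = rng.normal(size=(n_visible, n_hidden))

def rbm_unnorm_prob(v):
    """Unnormalized p(v): trace out binary hiddens analytically,
    giving exp(a.v) * prod_j (1 + exp(b_j + (v.W)_j))."""
    return np.exp(a @ v) * np.prod(1.0 + np.exp(b + v @ W))

# --- Born-machine side: MPS wavefunction, p(v) = |psi(v)|^2 / Z ---
bond_dim = 2
# One rank-3 tensor per site: (left bond, physical index {0,1}, right bond).
mps = [rng.normal(size=(1 if i == 0 else bond_dim, 2,
                        1 if i == n_visible - 1 else bond_dim))
       for i in range(n_visible)]

def mps_amplitude(v):
    """Contract the MPS matrices along the chain for one bit string v."""
    m = mps[0][:, v[0], :]
    for i in range(1, n_visible):
        m = m @ mps[i][:, v[i], :]
    return m[0, 0]

# Normalize both models by brute force (feasible for 4 binary units).
configs = [np.array([(k >> i) & 1 for i in range(n_visible)])
           for k in range(2 ** n_visible)]
Z_rbm = sum(rbm_unnorm_prob(v) for v in configs)
Z_born = sum(mps_amplitude(v) ** 2 for v in configs)

p_rbm = np.array([rbm_unnorm_prob(v) / Z_rbm for v in configs])
p_born = np.array([mps_amplitude(v) ** 2 / Z_born for v in configs])
```

Note the structural difference the paper exploits: the RBM's normalizer Z is a sum over e^{-F(v)}, while the Born machine's Z is the squared norm of the state, which for an MPS can be computed exactly by tensor contraction rather than enumeration.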

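The quantum Rényi entropy the abstract estimates for MPS representations measures bipartite entanglement: for a cut into subsystems A and B, S_2 = -log tr(ρ_A²), where ρ_A's eigenvalues are the squared Schmidt coefficients. A minimal sketch, using a random 4-qubit state rather than the paper's MNIST-derived MPS:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
psi = rng.normal(size=2 ** n) + 1j * rng.normal(size=2 ** n)
psi /= np.linalg.norm(psi)              # normalized pure state

# Bipartition into the first 2 and last 2 qubits, then Schmidt-decompose
# by reshaping the state vector into a matrix and taking its SVD.
theta = psi.reshape(2 ** 2, 2 ** 2)
s = np.linalg.svd(theta, compute_uv=False)   # Schmidt coefficients
p = s ** 2                                   # eigenvalues of rho_A, sum to 1

renyi2 = -np.log(np.sum(p ** 2))             # S_2 = -log tr(rho_A^2)
von_neumann = -np.sum(p * np.log(p))         # S_1, for comparison
```

Since Rényi entropies are non-increasing in the order α, S_2 lower-bounds the von Neumann entropy; for an MPS both are capped by log of the bond dimension across the cut, which is why small measured entropies support low-bond-dimension tensor-network models.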

Figures (PMC, g001–g006):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f71e/7513111/017d20504870/entropy-20-00583-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f71e/7513111/4b31f38f4fb5/entropy-20-00583-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f71e/7513111/e28b7b00e7ac/entropy-20-00583-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f71e/7513111/de1aeeb86923/entropy-20-00583-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f71e/7513111/46ccbd5aaf73/entropy-20-00583-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f71e/7513111/f64a60c4f0f8/entropy-20-00583-g006.jpg

Similar Articles

1. Information Perspective to Probabilistic Modeling: Boltzmann Machines versus Born Machines.
Entropy (Basel). 2018 Aug 7;20(8):583. doi: 10.3390/e20080583.
2. Tensor networks for unsupervised machine learning.
Phys Rev E. 2023 Jan;107(1):L012103. doi: 10.1103/PhysRevE.107.L012103.
3. Mutual Information Scaling for Tensor Network Machine Learning.
Mach Learn Sci Technol. 2022 Mar;3(1). doi: 10.1088/2632-2153/ac44a9. Epub 2022 Jan 20.
4. Barriers and dynamical paths in alternating Gibbs sampling of restricted Boltzmann machines.
Phys Rev E. 2021 Sep;104(3-1):034109. doi: 10.1103/PhysRevE.104.034109.
5. Thermodynamics of the Ising Model Encoded in Restricted Boltzmann Machines.
Entropy (Basel). 2022 Nov 22;24(12):1701. doi: 10.3390/e24121701.
6. Dynamic Topology Reconfiguration of Boltzmann Machines on Quantum Annealers.
Entropy (Basel). 2020 Oct 24;22(11):1202. doi: 10.3390/e22111202.
7. Expected energy-based restricted Boltzmann machine for classification.
Neural Netw. 2015 Apr;64:29-38. doi: 10.1016/j.neunet.2014.09.006. Epub 2014 Sep 28.
8. Entropy, Free Energy, and Work of Restricted Boltzmann Machines.
Entropy (Basel). 2020 May 11;22(5):538. doi: 10.3390/e22050538.
9. Measuring the usefulness of hidden units in Boltzmann machines with mutual information.
Neural Netw. 2015 Apr;64:12-8. doi: 10.1016/j.neunet.2014.09.004. Epub 2014 Sep 28.
10. Recent Advances for Quantum Neural Networks in Generative Learning.
IEEE Trans Pattern Anal Mach Intell. 2023 Oct;45(10):12321-12340. doi: 10.1109/TPAMI.2023.3272029. Epub 2023 Sep 5.

Cited By

1. Parameterized quantum circuits as universal generative models for continuous multivariate distributions.
npj Quantum Inf. 2025;11(1):121. doi: 10.1038/s41534-025-01064-3. Epub 2025 Jul 22.
2. Enhancing combinatorial optimization with classical and quantum generative models.
Nat Commun. 2024 Mar 29;15(1):2761. doi: 10.1038/s41467-024-46959-5.
3. Mutual Information Scaling for Tensor Network Machine Learning.
Mach Learn Sci Technol. 2022 Mar;3(1). doi: 10.1088/2632-2153/ac44a9. Epub 2022 Jan 20.
4. F-Divergences and Cost Function Locality in Generative Modelling with Quantum Circuits.
Entropy (Basel). 2021 Sep 30;23(10):1281. doi: 10.3390/e23101281.
5. A high-bias, low-variance introduction to Machine Learning for physicists.
Phys Rep. 2019 May 30;810:1-124. doi: 10.1016/j.physrep.2019.03.001. Epub 2019 Mar 14.

References

1. Neural Network Representation of Tensor Network and Chiral States.
Phys Rev Lett. 2021 Oct 22;127(17):170601. doi: 10.1103/PhysRevLett.127.170601.
2. Exploring cluster Monte Carlo updates with Boltzmann machines.
Phys Rev E. 2017 Nov;96(5-1):051301. doi: 10.1103/PhysRevE.96.051301. Epub 2017 Nov 16.
3. Efficient representation of quantum many-body states with deep neural networks.
Nat Commun. 2017 Sep 22;8(1):662. doi: 10.1038/s41467-017-00705-2.
4. Neural Decoder for Topological Codes.
Phys Rev Lett. 2017 Jul 21;119(3):030501. doi: 10.1103/PhysRevLett.119.030501. Epub 2017 Jul 18.
5. Emergence of Compositional Representations in Restricted Boltzmann Machines.
Phys Rev Lett. 2017 Mar 31;118(13):138301. doi: 10.1103/PhysRevLett.118.138301. Epub 2017 Mar 28.
6. Solving the quantum many-body problem with artificial neural networks.
Science. 2017 Feb 10;355(6325):602-606. doi: 10.1126/science.aag2302.
7. Retrieval capabilities of hierarchical networks: from Dyson to Hopfield.
Phys Rev Lett. 2015 Jan 16;114(2):028103. doi: 10.1103/PhysRevLett.114.028103.
8. Extensive parallel processing on scale-free networks.
Phys Rev Lett. 2014 Dec 5;113(23):238106. doi: 10.1103/PhysRevLett.113.238106.
9. Measuring the usefulness of hidden units in Boltzmann machines with mutual information.
Neural Netw. 2015 Apr;64:12-8. doi: 10.1016/j.neunet.2014.09.004. Epub 2014 Sep 28.
10. Bulk entanglement spectrum reveals quantum criticality within a topological state.
Phys Rev Lett. 2014 Sep 5;113(10):106801. doi: 10.1103/PhysRevLett.113.106801. Epub 2014 Sep 4.