Unsupervised feature learning from finite data by message passing: Discontinuous versus continuous phase transition.

Authors

Huang Haiping, Toyoizumi Taro

Affiliation

RIKEN Brain Science Institute, Wako-shi, Saitama 351-0198, Japan.

Publication

Phys Rev E. 2016 Dec;94(6-1):062310. doi: 10.1103/PhysRevE.94.062310. Epub 2016 Dec 21.

DOI: 10.1103/PhysRevE.94.062310
PMID: 28085373
Abstract

Unsupervised neural network learning extracts hidden features from unlabeled training data and is commonly used as a pretraining step for supervised learning in deep networks; understanding it is therefore of fundamental importance. Here, we study unsupervised learning from a finite number of data, based on a restricted Boltzmann machine with a single hidden neuron. Our study inspires an efficient message-passing algorithm to infer the hidden feature and to estimate the entropy of candidate features consistent with the data. Our analysis reveals that learning requires only a few data if the feature is salient, but extensively many if the feature is weak. Moreover, the entropy of candidate features decreases monotonically with data size and becomes negative (an entropy crisis) before the message passing becomes unstable, suggesting a discontinuous phase transition. In terms of the convergence time of the message-passing algorithm, unsupervised learning exhibits an easy-hard-easy phenomenon as the training data size increases. All these properties are reproduced in an approximate Hopfield model, with the exception that the entropy crisis is absent and only a continuous phase transition is observed. This key difference is also confirmed on a handwritten-digits dataset. This study deepens our understanding of unsupervised learning from a finite number of data and may provide insight into its role in training deep networks.
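The planted-feature setting the abstract studies can be sketched in a few lines. The snippet below is only an illustration of the problem, not the paper's message-passing algorithm: data are ±1 vectors generated from a hidden binary feature, and the feature is recovered with a simple power-iteration-style estimator. The parameters `N`, `M`, and the bias `m`, as well as the estimator itself, are assumptions made here for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, m = 200, 400, 0.3   # visible neurons, data samples, feature strength

# planted binary feature (the hidden feature to be learned)
w_true = rng.choice([-1, 1], size=N)

# generate data: a hidden cause h = +/-1 per sample; each visible unit
# agrees with h * w_true_i with probability (1 + m) / 2
h = rng.choice([-1, 1], size=M)
agree = rng.random((M, N)) < (1 + m) / 2
v = np.where(agree, h[:, None] * w_true, -h[:, None] * w_true)

# naive iterative estimate of the feature: a power-iteration-style
# update on the data's second moment (illustrative, not the paper's method)
w = rng.choice([-1, 1], size=N).astype(float)
for _ in range(50):
    overlap = v @ w / np.sqrt(N)   # per-sample alignment with current guess
    w = np.sign(v.T @ overlap)     # re-estimate the sign of each weight
    w[w == 0] = 1.0

# up to a global sign flip, the estimate should align with the planted feature
q = abs(w @ w_true) / N
print(f"overlap with planted feature: {q:.2f}")
```

With a salient feature (here m = 0.3) and a few hundred samples, the overlap approaches 1, matching the abstract's observation that only a few data suffice when the feature is strong; shrinking `m` makes recovery require far more samples.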


Similar articles

1. Incremental learning by message passing in hierarchical temporal memory.
Neural Comput. 2014 Aug;26(8):1763-809. doi: 10.1162/NECO_a_00617. Epub 2014 May 30.
2. Mean-field message-passing equations in the Hopfield model and its generalizations.
Phys Rev E. 2017 Feb;95(2-1):022117. doi: 10.1103/PhysRevE.95.022117. Epub 2017 Feb 14.
3. Learning representation hierarchies by sharing visual features: a computational investigation of Persian character recognition with unsupervised deep learning.
Cogn Process. 2017 Aug;18(3):273-284. doi: 10.1007/s10339-017-0796-7. Epub 2017 Feb 25.
4. Discriminative Unsupervised Feature Learning with Exemplar Convolutional Neural Networks.
IEEE Trans Pattern Anal Mach Intell. 2016 Sep;38(9):1734-47. doi: 10.1109/TPAMI.2015.2496141. Epub 2015 Oct 29.
5. Variational mean-field theory for training restricted Boltzmann machines with binary synapses.
Phys Rev E. 2020 Sep;102(3-1):030301. doi: 10.1103/PhysRevE.102.030301.
6. On the equivalence of Hopfield networks and Boltzmann Machines.
Neural Netw. 2012 Oct;34:1-9. doi: 10.1016/j.neunet.2012.06.003. Epub 2012 Jun 23.
7. Recognition of polymer configurations by unsupervised learning.
Phys Rev E. 2019 Apr;99(4-1):043307. doi: 10.1103/PhysRevE.99.043307.
8. Unsupervised feature learning for self-tuning neural networks.
Neural Netw. 2021 Jan;133:103-111. doi: 10.1016/j.neunet.2020.10.011. Epub 2020 Oct 22.
9. Message passing theory for percolation models on multiplex networks with link overlap.
Phys Rev E. 2016 Sep;94(3-1):032301. doi: 10.1103/PhysRevE.94.032301. Epub 2016 Sep 1.