


Neural Networks with a Redundant Representation: Detecting the Undetectable.

Affiliations

Dipartimento di Matematica "Guido Castelnuovo", Sapienza Università di Roma, 00185 Roma, Italy.

Dipartimento di Matematica e Fisica "Ennio De Giorgi", Università del Salento, 73100 Lecce, Italy.

Publication

Phys Rev Lett. 2020 Jan 17;124(2):028301. doi: 10.1103/PhysRevLett.124.028301.

DOI: 10.1103/PhysRevLett.124.028301
PMID: 32004010
Abstract

We consider a three-layer Sejnowski machine and show that features learnt via contrastive divergence have a dual representation as patterns in a dense associative memory of order P=4. The latter is known to be able to store, via Hebbian learning, a number of patterns scaling as N^{P-1}, where N denotes the number of constituent binary neurons interacting P-wise. We also prove that, by keeping the dense associative network far from the saturation regime (namely, allowing for a number of patterns scaling only linearly with N, while P>2), such a system is able to perform pattern recognition far below the standard signal-to-noise threshold. In particular, a network with P=4 is able to retrieve information whose intensity is O(1) even in the presence of noise of order O(sqrt(N)) in the large-N limit. This striking ability stems from a redundant representation of patterns, which is afforded by the (relatively) low-load information storage, and it helps explain the impressive pattern-recognition abilities exhibited by new-generation neural networks. The whole theory is developed rigorously at the replica-symmetric level of approximation and corroborated by signal-to-noise analysis and Monte Carlo simulations.
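The low-load retrieval regime described in the abstract can be illustrated with a short numerical sketch. This is not the authors' code: the network size N, the load K, the noise level, and the zero-temperature asynchronous dynamics below are illustrative choices, and the paper's replica-symmetric analysis is not reproduced. The sketch stores K patterns in a dense (P=4) associative memory with Hebbian energy E(sigma) = -N * sum_mu (xi^mu . sigma / N)^P and checks that a heavily corrupted pattern is recovered when K scales only linearly with N, far below the N^{P-1} saturation capacity:

```python
import numpy as np

rng = np.random.default_rng(0)

N, K, P = 200, 3, 4                    # neurons, stored patterns, interaction order
xi = rng.choice([-1, 1], size=(K, N))  # random binary patterns, Hebbian storage

def local_field(sigma):
    # Field proportional to minus the gradient of the dense (P-spin) energy
    # E = -N * sum_mu m_mu^P, i.e. h_i ∝ sum_mu m_mu^(P-1) * xi_i^mu.
    m = xi @ sigma / N                 # overlaps m_mu with each stored pattern
    return (m ** (P - 1)) @ xi

def retrieve(sigma, sweeps=20):
    # Asynchronous zero-temperature dynamics: align each spin with its field.
    sigma = sigma.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = local_field(sigma)[i]
            if h > 0:
                sigma[i] = 1
            elif h < 0:
                sigma[i] = -1
    return sigma

# Corrupt pattern 0 by flipping roughly 30% of its spins, then let the network relax.
noisy = xi[0] * rng.choice([1, -1], size=N, p=[0.7, 0.3])
fixed = retrieve(noisy)
overlap = xi[0] @ fixed / N
print(f"overlap after retrieval: {overlap:.2f}")
```

At roughly 30% flipped spins the initial overlap m with the stored pattern is only about 0.4, yet the (P-1)-th power of that overlap in the local field still dominates the cross-talk from the other patterns at this low load, so the dynamics typically restores an overlap close to 1. This is only a finite-size illustration of the mechanism; the regime the paper actually analyzes (signal O(1) against noise O(sqrt(N))) requires the large-N limit.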


Similar articles

1. Neural Networks with a Redundant Representation: Detecting the Undetectable.
   Phys Rev Lett. 2020 Jan 17;124(2):028301. doi: 10.1103/PhysRevLett.124.028301.
2. Hierarchical neural networks perform both serial and parallel processing.
   Neural Netw. 2015 Jun;66:22-35. doi: 10.1016/j.neunet.2015.02.010. Epub 2015 Mar 2.
3. Dreaming neural networks: Forgetting spurious memories and reinforcing pure ones.
   Neural Netw. 2019 Apr;112:24-40. doi: 10.1016/j.neunet.2019.01.006. Epub 2019 Jan 29.
4. Multitasking associative networks.
   Phys Rev Lett. 2012 Dec 28;109(26):268101. doi: 10.1103/PhysRevLett.109.268101. Epub 2012 Dec 26.
5. Deep associative neural network for associative memory based on unsupervised representation learning.
   Neural Netw. 2019 May;113:41-53. doi: 10.1016/j.neunet.2019.01.004. Epub 2019 Feb 1.
6. Emergence of low noise frustrated states in E/I balanced neural networks.
   Neural Netw. 2016 Dec;84:91-101. doi: 10.1016/j.neunet.2016.08.010. Epub 2016 Sep 8.
7. Parallel retrieval of correlated patterns: from Hopfield networks to Boltzmann machines.
   Neural Netw. 2013 Feb;38:52-63. doi: 10.1016/j.neunet.2012.11.010. Epub 2012 Nov 28.
8. Multitasking attractor networks with neuronal threshold noise.
   Neural Netw. 2014 Jan;49:19-29. doi: 10.1016/j.neunet.2013.09.008. Epub 2013 Sep 30.
9. Generalized Guerra's interpolation schemes for dense associative neural networks.
   Neural Netw. 2020 Aug;128:254-267. doi: 10.1016/j.neunet.2020.05.009. Epub 2020 May 20.
10. Joining distributed pattern processing and homeostatic plasticity in recurrent on-center off-surround shunting networks: noise, saturation, short-term memory, synaptic scaling, and BDNF.
    Neural Netw. 2012 Jan;25(1):21-9. doi: 10.1016/j.neunet.2011.07.009. Epub 2011 Aug 12.

Cited by

1. Boltzmann Machines as Generalized Hopfield Networks: A Review of Recent Results and Outlooks.
   Entropy (Basel). 2020 Dec 29;23(1):34. doi: 10.3390/e23010034.