
Robust Exponential Memory in Hopfield Networks.

Author Information

Hillar Christopher J, Tran Ngoc M

Affiliations

Redwood Center for Theoretical Neuroscience, Berkeley, CA, USA.

University of Texas at Austin, Austin, TX, USA.

Publication Information

J Math Neurosci. 2018 Jan 16;8(1):1. doi: 10.1186/s13408-017-0056-2.

DOI: 10.1186/s13408-017-0056-2
PMID: 29340803
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC5770423/
Abstract

The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store reoccurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. Here, we discover such networks by minimizing probability flow, a recently proposed objective for estimating parameters in discrete maximum entropy models. By descending the gradient of the convex probability flow, our networks adapt synaptic weights to achieve robust exponential storage, even when presented with vanishingly small numbers of training patterns. In addition to providing a new set of low-density error-correcting codes that achieve Shannon's noisy channel bound, these networks also efficiently solve a variant of the hidden clique problem in computer science, opening new avenues for real-world applications of computational models originating from biology.
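The mechanism summarized in the abstract — symmetric McCulloch-Pitts dynamics, with synaptic weights adapted by descending the convex minimum-probability-flow (MPF) objective over single-bit flips — can be illustrated compactly. The sketch below is not the authors' code; network size, learning rate, and iteration count are arbitrary choices for demonstration.

```python
import numpy as np

def mpf_step(X, W, b, lr=0.2):
    """One gradient-descent step on the probability-flow objective
    K = mean over patterns x and single-bit flips x' of exp((E(x) - E(x'))/2),
    with Hopfield energy E(x) = -x.T @ W @ x / 2 - b @ x,  x in {-1,+1}^n.
    For a flip of bit i (W symmetric, zero diagonal):
    (E(x) - E(x'))/2 = -x_i * h_i,  where h = W @ x + b."""
    m, n = X.shape
    H = X @ W + b                          # local fields for every pattern
    F = np.exp(-X * H)                     # one flip term per (pattern, bit)
    G = F * X
    dW = -(G.T @ X + X.T @ G) / (m * n)    # dK/dW, symmetric by construction
    np.fill_diagonal(dW, 0.0)              # keep self-couplings at zero
    db = -G.sum(axis=0) / (m * n)          # dK/db
    W -= lr * dW
    b -= lr * db
    return F.sum() / (m * n)               # K evaluated before this update

def recall(x, W, b, max_sweeps=100):
    """Deterministic dynamics: synchronous threshold updates to a fixed point."""
    for _ in range(max_sweeps):
        x_next = np.where(W @ x + b >= 0, 1, -1)
        if np.array_equal(x_next, x):
            break
        x = x_next
    return x

rng = np.random.default_rng(0)
n, m = 64, 3                               # illustrative sizes only
X = rng.choice([-1, 1], size=(m, n))       # training patterns
W, b = np.zeros((n, n)), np.zeros(n)
K = [mpf_step(X, W, b) for _ in range(500)]

noisy = X[0].copy()
noisy[:4] *= -1                            # corrupt 4 of 64 bits
recovered = recall(noisy, W, b)            # descend the energy back to X[0]
```

The appeal of MPF here is that each term compares a data state only with its single-bit-flip neighbors, so no partition function is ever computed; for the Hopfield energy the objective is convex in (W, b), and descending it drives every training pattern toward being a noise-tolerant attractor of the deterministic dynamics.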

Figures 1–7 (PMC full text):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a003/5770423/1727ffe92f32/13408_2017_56_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a003/5770423/0df6065597f5/13408_2017_56_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a003/5770423/bc9c263112c1/13408_2017_56_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a003/5770423/dfc0bb166c9d/13408_2017_56_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a003/5770423/98c51f3ce1f2/13408_2017_56_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a003/5770423/3e96df67579c/13408_2017_56_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a003/5770423/7350aabf7d94/13408_2017_56_Fig7_HTML.jpg

Similar Articles

1. Robust Exponential Memory in Hopfield Networks.
J Math Neurosci. 2018 Jan 16;8(1):1. doi: 10.1186/s13408-017-0056-2.
2. Hidden Hypergraphs, Error-Correcting Codes, and Critical Learning in Hopfield Networks.
Entropy (Basel). 2021 Nov 11;23(11):1494. doi: 10.3390/e23111494.
3. Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks.
Entropy (Basel). 2019 Jul 25;21(8):726. doi: 10.3390/e21080726.
4. Network capacity analysis for latent attractor computation.
Network. 2003 May;14(2):273-302.
5. Design and analysis of maximum Hopfield networks.
IEEE Trans Neural Netw. 2001;12(2):329-39. doi: 10.1109/72.914527.
6. Efficient Associative Computation with Discrete Synapses.
Neural Comput. 2016 Jan;28(1):118-86. doi: 10.1162/NECO_a_00795. Epub 2015 Nov 24.
7. An efficient approximation algorithm for finding a maximum clique using Hopfield network learning.
Neural Comput. 2003 Jul;15(7):1605-19. doi: 10.1162/089976603321891828.
8. Memory dynamics in attractor networks with saliency weights.
Neural Comput. 2010 Jul;22(7):1899-926. doi: 10.1162/neco.2010.07-09-1050.
9. Learning associative memories by error backpropagation.
IEEE Trans Neural Netw. 2011 Mar;22(3):347-55. doi: 10.1109/TNN.2010.2099239. Epub 2010 Dec 23.
10. Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study.
PLoS One. 2017 Oct 27;12(10):e0184683. doi: 10.1371/journal.pone.0184683. eCollection 2017.

Cited By

1. Detecting Signatures of Criticality Using Divergence Rate.
Entropy (Basel). 2025 Apr 30;27(5):487. doi: 10.3390/e27050487.
2. Photonic Stochastic Emergent Storage for deep classification by scattering-intrinsic patterns.
Nat Commun. 2024 Jan 13;15(1):505. doi: 10.1038/s41467-023-44498-z.
3. Hidden Hypergraphs, Error-Correcting Codes, and Critical Learning in Hopfield Networks.
Entropy (Basel). 2021 Nov 11;23(11):1494. doi: 10.3390/e23111494.

References

1. Computational principles of memory.
Nat Neurosci. 2016 Mar;19(3):394-403. doi: 10.1038/nn.4237.
2. Efficient Associative Computation with Discrete Synapses.
Neural Comput. 2016 Jan;28(1):118-86. doi: 10.1162/NECO_a_00795. Epub 2015 Nov 24.
3. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks.
PLoS Comput Biol. 2015 Aug 20;11(8):e1004439. doi: 10.1371/journal.pcbi.1004439. eCollection 2015 Aug.
4. Noise facilitation in associative memories of exponential capacity.
Neural Comput. 2014 Nov;26(11):2493-526. doi: 10.1162/NECO_a_00655. Epub 2014 Aug 22.
5. Combinatorial neural codes from a mathematical coding theory perspective.
Neural Comput. 2013 Jul;25(7):1891-925. doi: 10.1162/NECO_a_00459.
6. New method for parameter estimation in probabilistic models: minimum probability flow.
Phys Rev Lett. 2011 Nov 25;107(22):220601. doi: 10.1103/PhysRevLett.107.220601. Epub 2011 Nov 21.
7. Sparse neural networks with large learning diversity.
IEEE Trans Neural Netw. 2011 Jul;22(7):1087-96. doi: 10.1109/TNN.2011.2146789. Epub 2011 Jun 7.
8. Modulation of neuronal interactions through neuronal synchronization.
Science. 2007 Jun 15;316(5831):1609-12. doi: 10.1126/science.1139597.
9. Neuronal synchrony: a versatile code for the definition of relations?
Neuron. 1999 Sep;24(1):49-65, 111-25. doi: 10.1016/s0896-6273(00)80821-1.
10. Synchronization of cortical activity and its putative role in information processing and learning.
Annu Rev Physiol. 1993;55:349-74. doi: 10.1146/annurev.ph.55.030193.002025.