


Dreaming neural networks: Forgetting spurious memories and reinforcing pure ones.

Affiliations

Dipartimento di Matematica e Fisica Ennio De Giorgi, Università del Salento, Italy; GNFM-INdAM Sezione di Lecce, Italy; INFN, Istituto Nazionale di Fisica Nucleare, Sezione di Lecce, Italy.

Dipartimento di Matematica, Sapienza Università di Roma, Italy; GNFM-INdAM Sezione di Roma, Italy.

Publication information

Neural Netw. 2019 Apr;112:24-40. doi: 10.1016/j.neunet.2019.01.006. Epub 2019 Jan 29.

DOI: 10.1016/j.neunet.2019.01.006
PMID: 30735914
Abstract

The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator for pattern recognition, however its maximal storage capacity is α∼0.14, far from the theoretical bound for symmetric networks, i.e. α=1. Inspired by sleeping and dreaming mechanisms in mammal brains, we propose an extension of this model displaying the standard on-line (awake) learning mechanism (that allows the storage of external information in terms of patterns) and an off-line (sleep) unlearning&consolidating mechanism (that allows spurious-pattern removal and pure-pattern reinforcement): this obtained daily prescription is able to saturate the theoretical bound α=1, remaining also extremely robust against thermal noise. The emergent neural and synaptic features are analyzed both analytically and numerically. In particular, beyond obtaining a phase diagram for neural dynamics, we focus on synaptic plasticity and we give explicit prescriptions on the temporal evolution of the synaptic matrix. We analytically prove that our algorithm makes the Hebbian kernel converge with high probability to the projection matrix built over the pure stored patterns. Furthermore, we obtain a sharp and explicit estimate for the "sleep rate" in order to ensure such a convergence. Finally, we run extensive numerical simulations (mainly Monte Carlo sampling) to check the approximations underlying the analytical investigations (e.g., we developed the whole theory at the so called replica-symmetric level, as standard in the Amit-Gutfreund-Sompolinsky reference framework) and possible finite-size effects, finding overall full agreement with the theory.
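The sleep mechanism described in the abstract generalizes the classic Hopfield–Crick–Mitchison unlearning idea: relax the network from random states (which often land on spurious attractors) and weaken the Hebbian imprint of whatever state is reached. A minimal Python sketch of that general idea, not the paper's exact daily prescription; the network size, sleep rate `epsilon`, and dream count are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 64, 7                            # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))   # random binary patterns

# Hebbian coupling matrix (awake, on-line learning), zero self-coupling
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def relax(J, s, sweeps=15):
    """Asynchronous zero-temperature dynamics to (near) a fixed point."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if J[i] @ s >= 0 else -1
    return s

def dream(J, epsilon=0.01, n_dreams=30):
    """Off-line (sleep) unlearning sketch: relax from random states and
    subtract a small Hebbian imprint of the reached attractor."""
    J = J.copy()
    n = J.shape[0]
    for _ in range(n_dreams):
        s = relax(J, rng.choice([-1, 1], size=n))
        J -= epsilon * np.outer(s, s) / n
        np.fill_diagonal(J, 0.0)
    return J

def overlap(J, pattern, flip=6):
    """Retrieval quality: relax from a corrupted pattern, measure overlap."""
    s = pattern.copy()
    idx = rng.choice(len(s), size=flip, replace=False)
    s[idx] *= -1
    return abs(relax(J, s) @ pattern) / len(pattern)

print("overlap before dreaming:", overlap(J, xi[0]))
print("overlap after  dreaming:", overlap(dream(J), xi[0]))
```

The paper's actual prescription also reinforces the pure patterns during sleep and drives the coupling matrix toward the projection matrix over the stored patterns; the subtraction-only rule above captures only the unlearning half.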


Similar articles

1. Dreaming neural networks: Forgetting spurious memories and reinforcing pure ones. Neural Netw. 2019 Apr;112:24-40. doi: 10.1016/j.neunet.2019.01.006. Epub 2019 Jan 29.
2. Multitasking associative networks. Phys Rev Lett. 2012 Dec 28;109(26):268101. doi: 10.1103/PhysRevLett.109.268101. Epub 2012 Dec 26.
3. A new mechanical approach to handle generalized Hopfield neural networks. Neural Netw. 2018 Oct;106:205-222. doi: 10.1016/j.neunet.2018.07.010. Epub 2018 Jul 21.
4. Hebbian dreaming for small datasets. Neural Netw. 2024 May;173:106174. doi: 10.1016/j.neunet.2024.106174. Epub 2024 Feb 12.
5. Memory dynamics in attractor networks with saliency weights. Neural Comput. 2010 Jul;22(7):1899-926. doi: 10.1162/neco.2010.07-09-1050.
6. Multitasking attractor networks with neuronal threshold noise. Neural Netw. 2014 Jan;49:19-29. doi: 10.1016/j.neunet.2013.09.008. Epub 2013 Sep 30.
7. Emergence of low noise frustrated states in E/I balanced neural networks. Neural Netw. 2016 Dec;84:91-101. doi: 10.1016/j.neunet.2016.08.010. Epub 2016 Sep 8.
8. Network capacity analysis for latent attractor computation. Network. 2003 May;14(2):273-302.
9. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks. PLoS Comput Biol. 2015 Aug 20;11(8):e1004439. doi: 10.1371/journal.pcbi.1004439. eCollection 2015 Aug.
10. Supervised perceptron learning vs unsupervised Hebbian unlearning: Approaching optimal memory retrieval in Hopfield-like networks. J Chem Phys. 2022 Mar 14;156(10):104107. doi: 10.1063/5.0084219.

Cited by

1. Why do we think? The dynamics of spontaneous thought reveal its functions. PNAS Nexus. 2024 Jun 12;3(6):pgae230. doi: 10.1093/pnasnexus/pgae230. eCollection 2024 Jun.
2. Thalamo-cortical spiking model of incremental learning combining perception, context and NREM-sleep. PLoS Comput Biol. 2021 Jun 28;17(6):e1009045. doi: 10.1371/journal.pcbi.1009045. eCollection 2021 Jun.
3. Unsupervised Learning Facilitates Neural Coordination Across the Functional Clusters of the Connectome. Front Robot AI. 2020 Apr 2;7:40. doi: 10.3389/frobt.2020.00040. eCollection 2020.
4. Boltzmann Machines as Generalized Hopfield Networks: A Review of Recent Results and Outlooks. Entropy (Basel). 2020 Dec 29;23(1):34. doi: 10.3390/e23010034.
5. Can sleep protect memories from catastrophic forgetting? Elife. 2020 Aug 4;9:e51005. doi: 10.7554/eLife.51005.