Supervised perceptron learning vs unsupervised Hebbian unlearning: Approaching optimal memory retrieval in Hopfield-like networks.

Affiliations

Dipartimento di Fisica, Sapienza Università di Roma, P.le A. Moro 2, 00185 Roma, Italy.

Laboratoire de Physique de l'Ecole Normale Supérieure, ENS, Université PSL, CNRS, Sorbonne Université, Université de Paris, F-75005 Paris, France.

Publication information

J Chem Phys. 2022 Mar 14;156(10):104107. doi: 10.1063/5.0084219.

Abstract

The Hebbian unlearning algorithm, i.e., an unsupervised local procedure used to improve the retrieval properties in Hopfield-like neural networks, is numerically compared with a supervised algorithm that trains a linear symmetric perceptron. We analyze the stability of the stored memories: the basins of attraction obtained by the Hebbian unlearning technique are comparable in size to those obtained with the symmetric perceptron, and the two algorithms converge in the same region of Gardner's space of interactions, having followed similar learning paths. A geometric interpretation of Hebbian unlearning is proposed to explain its optimal performance. Because the Hopfield model is also a prototypical model of disordered magnetic systems, it may be possible to translate our results to other models of interest for memory storage in materials.
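The unlearning procedure compared in the abstract can be sketched as follows. This is a minimal illustration of the classic Hopfield/Hebbian unlearning scheme, not the authors' code: patterns are stored with the Hebb rule, and each unlearning step relaxes the network from a random state to an attractor and then subtracts that attractor's Hebbian contribution from the couplings. The sizes `N`, `P`, the rate `eps`, and the number of steps are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 64, 8        # neurons and stored patterns (alpha = P/N below capacity)
eps = 0.01          # unlearning rate (illustrative value)

# Hebbian storage: J_ij = (1/N) * sum_mu xi^mu_i xi^mu_j, no self-coupling
xi = rng.choice([-1, 1], size=(P, N))
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

def relax(J, s, max_sweeps=100):
    """Asynchronous zero-temperature dynamics until a fixed point is reached."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(len(s)):
            new = 1 if J[i] @ s >= 0 else -1
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:
            break
    return s

def unlearning_step(J, eps):
    """One Hebbian unlearning step: relax from a random configuration,
    then subtract the Hebbian term of the attractor that was reached."""
    s = relax(J, rng.choice([-1, 1], size=J.shape[0]))
    J = J - eps * np.outer(s, s) / J.shape[0]
    np.fill_diagonal(J, 0.0)
    return J

for _ in range(200):
    J = unlearning_step(J, eps)

# How many stored patterns remain fixed points of the dynamics
stable = sum(np.array_equal(relax(J, p), p) for p in xi)
print(f"{stable}/{P} patterns stable after unlearning")
```

Because unlearning preferentially erases spurious attractors (mixtures of patterns) while leaving the stored memories nearly untouched, the basins of attraction of the memories grow as the procedure runs, which is the effect the paper quantifies against the symmetric perceptron.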

Similar articles

2. Dreaming neural networks: Forgetting spurious memories and reinforcing pure ones.
   Neural Netw. 2019 Apr;112:24-40. doi: 10.1016/j.neunet.2019.01.006. Epub 2019 Jan 29.
3. A new mechanical approach to handle generalized Hopfield neural networks.
   Neural Netw. 2018 Oct;106:205-222. doi: 10.1016/j.neunet.2018.07.010. Epub 2018 Jul 21.
4. Designing asymmetric Hopfield-type associative memory with higher order hamming stability.
   IEEE Trans Neural Netw. 2005 Nov;16(6):1464-76. doi: 10.1109/TNN.2005.852863.
5. Prototype Analysis in Hopfield Networks With Hebbian Learning.
   Neural Comput. 2024 Oct 11;36(11):2322-2364. doi: 10.1162/neco_a_01704.
6. Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks.
   Entropy (Basel). 2019 Jul 25;21(8):726. doi: 10.3390/e21080726.
7. Hebbian learning in parallel and modular memories.
   Biol Cybern. 1998 Feb;78(2):79-86. doi: 10.1007/s004220050415.
8. "Unlearning" increases the storage capacity of content addressable memories.
   Biophys J. 1987 Jan;51(1):47-53. doi: 10.1016/S0006-3495(87)83310-6.
9. Bistable gradient networks. I. Attractors and pattern retrieval at low loading in the thermodynamic limit.
   Phys Rev E Stat Nonlin Soft Matter Phys. 2003 Jan;67(1 Pt 2):016118. doi: 10.1103/PhysRevE.67.016118. Epub 2003 Jan 30.
