

Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models.

Author Information

Millidge Beren, Salvatori Tommaso, Song Yuhang, Lukasiewicz Thomas, Bogacz Rafal

Affiliations

MRC Brain Network Dynamics Unit, University of Oxford, UK.

Department of Computer Science, University of Oxford, UK.

Publication Information

Proc Mach Learn Res. 2022 Jul;162:15561-15583.

Abstract

A large number of neural network models of associative memory have been proposed in the literature. These include the classical Hopfield networks (HNs), sparse distributed memories (SDMs), and more recently the modern continuous Hopfield networks (MCHNs), which possess close links with self-attention in machine learning. In this paper, we propose a general framework for understanding the operation of such memory networks as a sequence of three operations: similarity, separation, and projection. We derive all these memory models as instances of our general framework with differing similarity and separation functions. We extend the mathematical framework of Krotov & Hopfield (2020) to express general associative memory models using neural network dynamics with local computation, and derive a general energy function that is a Lyapunov function of the dynamics. Finally, using our framework, we empirically investigate the capacity of using different similarity functions for these associative memory models, beyond the dot product similarity measure, and demonstrate empirically that Euclidean or Manhattan distance similarity metrics perform substantially better in practice on many tasks, enabling a more robust retrieval and higher memory capacity than existing models.
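The single-shot retrieval pipeline the abstract describes — score a query against stored memories (similarity), sharpen the scores (separation), then read out a weighted combination of memories (projection) — can be sketched in NumPy. The function below is an illustrative reconstruction, not the authors' code: the `beta` sharpening parameter, the softmax separation function, and the default value of 8.0 are assumptions for the sketch.

```python
import numpy as np

def retrieve(memories, query, similarity="dot", beta=8.0):
    """Single-shot associative retrieval as three steps:
    similarity -> separation -> projection.

    memories: (N, D) array of stored patterns (one per row)
    query:    (D,) possibly corrupted probe vector
    """
    # 1. Similarity: score the query against every stored pattern.
    #    Distance-based metrics are negated so that higher = more similar.
    if similarity == "dot":
        scores = memories @ query
    elif similarity == "euclidean":
        scores = -np.linalg.norm(memories - query, axis=1)
    elif similarity == "manhattan":
        scores = -np.abs(memories - query).sum(axis=1)
    else:
        raise ValueError(f"unknown similarity: {similarity}")

    # 2. Separation: a numerically stable softmax sharpens the score
    #    vector, pushing the weight toward the best-matching memory.
    weights = np.exp(beta * (scores - scores.max()))
    weights /= weights.sum()

    # 3. Projection: return the weight-averaged stored patterns.
    return weights @ memories
```

Swapping the `similarity` argument between `"dot"`, `"euclidean"`, and `"manhattan"` reproduces the comparison the paper investigates empirically; with a corrupted copy of a stored pattern as the query, all three variants should return a vector close to the original pattern.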


Similar Articles

2
In Search of Dispersed Memories: Generative Diffusion Models Are Associative Memory Networks.
Entropy (Basel). 2024 Apr 29;26(5):381. doi: 10.3390/e26050381.
3
Network Dynamics Governed by Lyapunov Functions: From Memory to Classification.
Trends Neurosci. 2020 Jul;43(7):453-455. doi: 10.1016/j.tins.2020.04.002. Epub 2020 May 5.
4
Input-driven dynamics for robust memory retrieval in Hopfield networks.
Sci Adv. 2025 Apr 25;11(17):eadu6991. doi: 10.1126/sciadv.adu6991. Epub 2025 Apr 23.
5
Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks.
Entropy (Basel). 2019 Jul 25;21(8):726. doi: 10.3390/e21080726.
6
On stability and associative recall of memories in attractor neural networks.
PLoS One. 2020 Sep 17;15(9):e0238054. doi: 10.1371/journal.pone.0238054. eCollection 2020.
7
Associative memory realized by a reconfigurable memristive Hopfield neural network.
Nat Commun. 2015 Jun 25;6:7522. doi: 10.1038/ncomms8522.
8
Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study.
PLoS One. 2017 Oct 27;12(10):e0184683. doi: 10.1371/journal.pone.0184683. eCollection 2017.
10
Robust computation with rhythmic spike patterns.
Proc Natl Acad Sci U S A. 2019 Sep 3;116(36):18050-18059. doi: 10.1073/pnas.1902653116. Epub 2019 Aug 20.

Cited By

1
A sparse quantized hopfield network for online-continual memory.
Nat Commun. 2024 May 2;15(1):3722. doi: 10.1038/s41467-024-46976-4.
2
Sequential Memory with Temporal Predictive Coding.
Adv Neural Inf Process Syst. 2023;36:44341-44355.
3
A generative model of memory construction and consolidation.
Nat Hum Behav. 2024 Mar;8(3):526-543. doi: 10.1038/s41562-023-01799-z. Epub 2024 Jan 19.

References

1
Associative Memories via Predictive Coding.
Adv Neural Inf Process Syst. 2021 Dec 1;34:3874-3886.
2
Rapid encoding of musical tones discovered in whole-brain connectivity.
Neuroimage. 2021 Dec 15;245:118735. doi: 10.1016/j.neuroimage.2021.118735. Epub 2021 Nov 20.
3
Overparameterized neural networks implement associative memory.
Proc Natl Acad Sci U S A. 2020 Nov 3;117(44):27162-27170. doi: 10.1073/pnas.2005013117. Epub 2020 Oct 16.
4
The mechanisms for pattern completion and pattern separation in the hippocampus.
Front Syst Neurosci. 2013 Oct 30;7:74. doi: 10.3389/fnsys.2013.00074.
5
The concave-convex procedure.
Neural Comput. 2003 Apr;15(4):915-36. doi: 10.1162/08997660360581958.
6
The asymptotic memory capacity of the generalized Hopfield network.
Neural Netw. 1999 Nov;12(9):1207-1212. doi: 10.1016/s0893-6080(99)00042-8.
7
Number of stable points for spin-glasses and neural networks of higher orders.
Phys Rev Lett. 1987 Mar 2;58(9):913-916. doi: 10.1103/PhysRevLett.58.913.
8
Storage capacity of generalized networks.
Phys Rev A Gen Phys. 1987 Nov 15;36(10):5091-5094. doi: 10.1103/physreva.36.5091.
9
Neural networks and physical systems with emergent collective computational abilities.
Proc Natl Acad Sci U S A. 1982 Apr;79(8):2554-8. doi: 10.1073/pnas.79.8.2554.
10
Neurons with graded response have collective computational properties like those of two-state neurons.
Proc Natl Acad Sci U S A. 1984 May;81(10):3088-92. doi: 10.1073/pnas.81.10.3088.
