Suppr 超能文献



Associative Memories via Predictive Coding

Authors

Salvatori Tommaso, Song Yuhang, Hong Yujian, Sha Lei, Frieder Simon, Xu Zhenghua, Bogacz Rafal, Lukasiewicz Thomas

Affiliations

Department of Computer Science, University of Oxford, UK.

MRC Brain Network Dynamics Unit, University of Oxford, UK.

Publication

Adv Neural Inf Process Syst. 2021 Dec 1;34:3874-3886.

PMID: 35664437
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7612799/
Abstract

Associative memories in the brain receive and store patterns of activity registered by the sensory neurons, and are able to retrieve them when necessary. Due to their importance in human intelligence, computational models of associative memories have been developed for several decades. In this paper, we present a novel neural model for realizing associative memories, based on a hierarchical generative network that receives external stimuli via sensory neurons. It is trained using predictive coding, an error-based learning algorithm inspired by information processing in the cortex. To test the model's capabilities, we perform multiple retrieval experiments from both corrupted and incomplete data points. In an extensive comparison, we show that this new model outperforms popular associative memory models, such as autoencoders trained via backpropagation and modern Hopfield networks, in both retrieval accuracy and robustness. In particular, in completing partial data points, our model achieves remarkable results on natural image datasets, such as ImageNet, with surprisingly high accuracy even when only a tiny fraction of the pixels of the original images is presented. Our model provides a plausible framework to study learning and retrieval of memories in the brain, as it closely mimics the behavior of the hippocampus as a memory index and generative model.

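The abstract describes a hierarchical generative network trained with predictive coding, which retrieves stored patterns by clamping the known sensory entries and letting the remaining activity relax. The paper's actual model is not reproduced here; the following is a minimal NumPy sketch of that general scheme, where the layer sizes, learning rates, and tanh nonlinearity are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def f(x):  return np.tanh(x)          # illustrative nonlinearity
def df(x): return 1.0 - np.tanh(x)**2

class PCNet:
    """Hierarchical generative net: layer 0 is the top latent layer,
    the last layer is the sensory layer; W[l] predicts layer l from l-1."""
    def __init__(self, sizes, seed=0):
        rng = np.random.default_rng(seed)
        self.sizes = sizes
        self.W = [None] + [rng.normal(0, 0.1, (sizes[l], sizes[l - 1]))
                           for l in range(1, len(sizes))]

    def relax(self, x, clamped, n_steps=60, lr=0.05):
        """Gradient descent on F = sum_l ||x_l - W_l f(x_{l-1})||^2 / 2;
        entries where `clamped` is True are held fixed."""
        L = len(self.sizes)
        for _ in range(n_steps):
            e = [None] + [x[l] - self.W[l] @ f(x[l - 1]) for l in range(1, L)]
            for l in range(L):
                grad = e[l] if l > 0 else 0.0
                if l + 1 < L:  # top-down error feedback
                    grad = grad - df(x[l]) * (self.W[l + 1].T @ e[l + 1])
                x[l] = np.where(clamped[l], x[l], x[l] - lr * grad)
        e = [None] + [x[l] - self.W[l] @ f(x[l - 1]) for l in range(1, L)]
        return x, e

    def memorize(self, patterns, n_epochs=300, lr_w=0.02):
        """Clamp the sensory layer to each pattern, relax, then take a
        gradient step on the weights (the predictive-coding learning rule)."""
        for _ in range(n_epochs):
            for p in patterns:
                x = [np.zeros(s) for s in self.sizes]
                x[-1] = p.copy()
                clamped = [np.zeros(s, bool) for s in self.sizes]
                clamped[-1][:] = True
                x, e = self.relax(x, clamped)
                for l in range(1, len(self.sizes)):
                    self.W[l] += lr_w * np.outer(e[l], f(x[l - 1]))

    def retrieve(self, partial, known_mask, n_steps=200):
        """Clamp only the known sensory entries; the free entries and the
        latent layers relax toward a stored pattern."""
        x = [np.zeros(s) for s in self.sizes]
        x[-1] = np.where(known_mask, partial, 0.0)
        clamped = [np.zeros(s, bool) for s in self.sizes]
        clamped[-1] = known_mask
        x, _ = self.relax(x, clamped, n_steps=n_steps)
        return x[-1]

# Toy usage: store three random 16-pixel patterns, then complete one
# from its first half (mirroring the paper's partial-data-point setting).
rng = np.random.default_rng(1)
patterns = [rng.choice([-1.0, 1.0], size=16) for _ in range(3)]
net = PCNet([8, 12, 16])
net.memorize(patterns)
mask = np.arange(16) < 8          # first half of the pixels is known
out = net.retrieve(patterns[0], mask)
```

The clamped entries are preserved exactly, while the free half is filled in by the generative pass from the relaxed latent state; with realistic data one would tune the layer sizes and step counts, which are arbitrary here.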

Similar Articles

1. Associative Memories via Predictive Coding. Adv Neural Inf Process Syst. 2021 Dec 1;34:3874-3886.
2. Short-Term Memory Impairment.
3. In Search of Dispersed Memories: Generative Diffusion Models Are Associative Memory Networks. Entropy (Basel). 2024 Apr 29;26(5):381. doi: 10.3390/e26050381.
4. Neural learning rules for generating flexible predictions and computing the successor representation. Elife. 2023 Mar 16;12:e80680. doi: 10.7554/eLife.80680.
5. A neural network model of when to retrieve and encode episodic memories. Elife. 2022 Feb 10;11:e74445. doi: 10.7554/eLife.74445.
6. Learning associative memories by error backpropagation. IEEE Trans Neural Netw. 2011 Mar;22(3):347-55. doi: 10.1109/TNN.2010.2099239. Epub 2010 Dec 23.
7. Deep associative neural network for associative memory based on unsupervised representation learning. Neural Netw. 2019 May;113:41-53. doi: 10.1016/j.neunet.2019.01.004. Epub 2019 Feb 1.
8. Performance of a Computational Model of the Mammalian Olfactory System.
9. The memory systems of the human brain and generative artificial intelligence. Heliyon. 2024 May 24;10(11):e31965. doi: 10.1016/j.heliyon.2024.e31965. eCollection 2024 Jun 15.
10. Input-driven dynamics for robust memory retrieval in Hopfield networks. Sci Adv. 2025 Apr 25;11(17):eadu6991. doi: 10.1126/sciadv.adu6991. Epub 2025 Apr 23.

Cited By

1. Predictive Coding Model Detects Novelty on Different Levels of Representation Hierarchy. Neural Comput. 2025 Jul 17;37(8):1373-1408. doi: 10.1162/neco_a_01769.
2. Inspires effective alternatives to backpropagation: predictive coding helps understand and build learning. Neural Regen Res. 2025 Nov 1;20(11):3215-3216. doi: 10.4103/NRR.NRR-D-24-00629. Epub 2024 Oct 22.
3. Learning probability distributions of sensory inputs with Monte Carlo predictive coding. PLoS Comput Biol. 2024 Oct 30;20(10):e1012532. doi: 10.1371/journal.pcbi.1012532. eCollection 2024 Oct.
4. In Search of Dispersed Memories: Generative Diffusion Models Are Associative Memory Networks. Entropy (Basel). 2024 Apr 29;26(5):381. doi: 10.3390/e26050381.
5. A sparse quantized hopfield network for online-continual memory. Nat Commun. 2024 May 2;15(1):3722. doi: 10.1038/s41467-024-46976-4.
6. Predictive coding with spiking neurons and feedforward gist signaling. Front Comput Neurosci. 2024 Apr 12;18:1338280. doi: 10.3389/fncom.2024.1338280. eCollection 2024.
7. Sequential Memory with Temporal Predictive Coding. Adv Neural Inf Process Syst. 2023;36:44341-44355.
8. Fast adaptation to rule switching using neuronal surprise. PLoS Comput Biol. 2024 Feb 20;20(2):e1011839. doi: 10.1371/journal.pcbi.1011839. eCollection 2024 Feb.
9. Dynamic predictive coding: A model of hierarchical sequence learning and prediction in the neocortex. PLoS Comput Biol. 2024 Feb 8;20(2):e1011801. doi: 10.1371/journal.pcbi.1011801. eCollection 2024 Feb.
10. The storage capacity of a directed graph and nodewise autonomous, ubiquitous learning. Front Comput Neurosci. 2023 Oct 19;17:1254355. doi: 10.3389/fncom.2023.1254355. eCollection 2023.

References

1. Predictive Coding Approximates Backprop Along Arbitrary Computation Graphs. Neural Comput. 2022 May 19;34(6):1329-1368. doi: 10.1162/neco_a_01497.
2. The neural coding framework for learning generative models. Nat Commun. 2022 Apr 19;13(1):2064. doi: 10.1038/s41467-022-29632-7.
3. Can the Brain Do Backpropagation? Exact Implementation of Backpropagation in Predictive Coding Networks. Adv Neural Inf Process Syst. 2020;33:22566-22579.
4. Overparameterized neural networks implement associative memory. Proc Natl Acad Sci U S A. 2020 Nov 3;117(44):27162-27170. doi: 10.1073/pnas.2005013117. Epub 2020 Oct 16.
5. Prediction and memory: A predictive coding account. Prog Neurobiol. 2020 Sep;192:101821. doi: 10.1016/j.pneurobio.2020.101821. Epub 2020 May 21.
6. Unsupervised learning by competing hidden units. Proc Natl Acad Sci U S A. 2019 Apr 16;116(16):7723-7731. doi: 10.1073/pnas.1820458116. Epub 2019 Mar 29.
7. Deep associative neural network for associative memory based on unsupervised representation learning. Neural Netw. 2019 May;113:41-53. doi: 10.1016/j.neunet.2019.01.004. Epub 2019 Feb 1.
8. Theories of Error Back-Propagation in the Brain. Trends Cogn Sci. 2019 Mar;23(3):235-250. doi: 10.1016/j.tics.2018.12.005. Epub 2019 Jan 28.
9. Generative Predictive Codes by Multiplexed Hippocampal Neuronal Tuplets. Neuron. 2018 Sep 19;99(6):1329-1341.e6. doi: 10.1016/j.neuron.2018.07.047. Epub 2018 Aug 23.
10. The hippocampus as a predictive map. Nat Neurosci. 2017 Nov;20(11):1643-1653. doi: 10.1038/nn.4650. Epub 2017 Oct 2.