Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data.

Affiliations

Department of Statistics, The University of Chicago, Chicago, IL 60637, USA.

Department of Statistics, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, Duke University, Durham, NC 27710, USA; Department of Physics, Duke University, Durham, NC 27708, USA.

Publication information

Neuron. 2018 Jul 11;99(1):227-238.e4. doi: 10.1016/j.neuron.2018.05.038. Epub 2018 Jun 14.

Abstract

The attractor neural network scenario is a popular scenario for memory storage in the association cortex, but there is still a large gap between models based on this scenario and experimental data. We study a recurrent network model in which both learning rules and distribution of stored patterns are inferred from distributions of visual responses for novel and familiar images in the inferior temporal cortex (ITC). Unlike classical attractor neural network models, our model exhibits graded activity in retrieval states, with distributions of firing rates that are close to lognormal. Inferred learning rules are close to maximizing the number of stored patterns within a family of unsupervised Hebbian learning rules, suggesting that learning rules in ITC are optimized to store a large number of attractor states. Finally, we show that there exist two types of retrieval states: one in which firing rates are constant in time and another in which firing rates fluctuate chaotically.
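The paper's model infers rate-based learning rules and graded, lognormal-like pattern statistics from ITC responses. As a generic illustration of the underlying idea (Hebbian outer-product storage followed by attractor retrieval from a corrupted cue), here is a minimal binary Hopfield-style sketch; it is not the paper's fitted model, and the pattern statistics, transfer function, and update scheme are all simplified assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 5  # neurons, stored patterns (load P/N well below classic capacity ~0.14)

# Random +/-1 patterns; the paper instead infers graded patterns from ITC data.
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Unsupervised Hebbian outer-product rule: J_ij = (1/N) * sum_mu xi_i^mu xi_j^mu
J = patterns.T @ patterns / N
np.fill_diagonal(J, 0.0)  # no self-coupling

def retrieve(state, steps=20):
    """Synchronous sign-update dynamics, stopping at a fixed point or step cap."""
    for _ in range(steps):
        new = np.sign(J @ state)
        new[new == 0] = 1.0
        if np.array_equal(new, state):
            break
        state = new
    return state

# Cue the network with a corrupted copy of pattern 0 (~14% of units flipped)
cue = patterns[0].copy()
flip = rng.choice(N, size=N // 7, replace=False)
cue[flip] *= -1

recalled = retrieve(cue)
overlap = float(recalled @ patterns[0]) / N  # 1.0 would mean perfect recall
```

At this low storage load, the dynamics fall into the attractor near the cued pattern, so the overlap with the stored pattern ends close to 1. The binary fixed points here correspond only to the first type of retrieval state the abstract describes; the graded, lognormal firing-rate distributions and chaotic retrieval states of the actual model require the inferred rate-based rules.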

Similar articles

1
Hebbian learning of context in recurrent neural networks.
Neural Comput. 1996 Nov 15;8(8):1677-710. doi: 10.1162/neco.1996.8.8.1677.
2
Computational modeling of pair-association memory in inferior temporal cortex.
Brain Res Cogn Brain Res. 2002 Apr;13(2):169-78. doi: 10.1016/s0926-6410(01)00109-4.
3
A balanced memory network.
PLoS Comput Biol. 2007 Sep;3(9):1679-700. doi: 10.1371/journal.pcbi.0030141. Epub 2007 Jun 5.

Cited by

1
Stochastic activity in low-rank recurrent neural networks.
PLoS Comput Biol. 2025 Aug 18;21(8):e1013371. doi: 10.1371/journal.pcbi.1013371.
2
Stochastic activity in low-rank recurrent neural networks.
bioRxiv. 2025 Jul 11:2025.04.22.649933. doi: 10.1101/2025.04.22.649933.
3
A Computational Framework for Memory Engrams.
Adv Neurobiol. 2024;38:237-257. doi: 10.1007/978-3-031-62983-9_13.
4
Temporal multiplexing of perception and memory codes in IT cortex.
Nature. 2024 May;629(8013):861-868. doi: 10.1038/s41586-024-07349-5. Epub 2024 May 15.
5
Engram mechanisms of memory linking and identity.
Nat Rev Neurosci. 2024 Jun;25(6):375-392. doi: 10.1038/s41583-024-00814-0. Epub 2024 Apr 25.

References

1
Is cortical connectivity optimized for storing information?
Nat Neurosci. 2016 May;19(5):749-755. doi: 10.1038/nn.4286. Epub 2016 Apr 11.
2
Asynchronous Rate Chaos in Spiking Neuronal Circuits.
PLoS Comput Biol. 2015 Jul 31;11(7):e1004266. doi: 10.1371/journal.pcbi.1004266. eCollection 2015 Jul.
