

Similar Articles

1. Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data.
   Neuron. 2018 Jul 11;99(1):227-238.e4. doi: 10.1016/j.neuron.2018.05.038. Epub 2018 Jun 14.
2. Network capacity analysis for latent attractor computation.
   Network. 2003 May;14(2):273-302.
3. Hebbian learning of context in recurrent neural networks.
   Neural Comput. 1996 Nov 15;8(8):1677-710. doi: 10.1162/neco.1996.8.8.1677.
4. Inferring learning rules from distributions of firing rates in cortical neurons.
   Nat Neurosci. 2015 Dec;18(12):1804-10. doi: 10.1038/nn.4158. Epub 2015 Nov 2.
5. Memory dynamics in attractor networks with saliency weights.
   Neural Comput. 2010 Jul;22(7):1899-926. doi: 10.1162/neco.2010.07-09-1050.
6. Computational modeling of pair-association memory in inferior temporal cortex.
   Brain Res Cogn Brain Res. 2002 Apr;13(2):169-78. doi: 10.1016/s0926-6410(01)00109-4.
7. Characteristics of sequential activity in networks with temporally asymmetric Hebbian learning.
   Proc Natl Acad Sci U S A. 2020 Nov 24;117(47):29948-29958. doi: 10.1073/pnas.1918674117. Epub 2020 Nov 11.
8. A balanced memory network.
   PLoS Comput Biol. 2007 Sep;3(9):1679-700. doi: 10.1371/journal.pcbi.0030141. Epub 2007 Jun 5.
9. Probabilistic associative learning suffices for learning the temporal structure of multiple sequences.
   PLoS One. 2019 Aug 1;14(8):e0220161. doi: 10.1371/journal.pone.0220161. eCollection 2019.
10. A Gaussian attractor network for memory and recognition with experience-dependent learning.
    Neural Comput. 2010 May;22(5):1333-57. doi: 10.1162/neco.2010.02-09-957.

Cited By

1. Stochastic activity in low-rank recurrent neural networks.
   PLoS Comput Biol. 2025 Aug 18;21(8):e1013371. doi: 10.1371/journal.pcbi.1013371.
2. Stochastic activity in low-rank recurrent neural networks.
   bioRxiv. 2025 Jul 11:2025.04.22.649933. doi: 10.1101/2025.04.22.649933.
3. Representational learning by optimization of neural manifolds in an olfactory memory network.
   Res Sq. 2025 Mar 26:rs.3.rs-6155477. doi: 10.21203/rs.3.rs-6155477/v1.
4. Representational learning by optimization of neural manifolds in an olfactory memory network.
   bioRxiv. 2024 Nov 18:2024.11.17.623906. doi: 10.1101/2024.11.17.623906.
5. Dynamic control of sequential retrieval speed in networks with heterogeneous learning rules.
   Elife. 2024 Aug 28;12:RP88805. doi: 10.7554/eLife.88805.
6. Spiking attractor model of motor cortex explains modulation of neural and behavioral variability by prior target information.
   Nat Commun. 2024 Jul 26;15(1):6304. doi: 10.1038/s41467-024-49889-4.
7. A Computational Framework for Memory Engrams.
   Adv Neurobiol. 2024;38:237-257. doi: 10.1007/978-3-031-62983-9_13.
8. Temporal multiplexing of perception and memory codes in IT cortex.
   Nature. 2024 May;629(8013):861-868. doi: 10.1038/s41586-024-07349-5. Epub 2024 May 15.
9. Engram mechanisms of memory linking and identity.
   Nat Rev Neurosci. 2024 Jun;25(6):375-392. doi: 10.1038/s41583-024-00814-0. Epub 2024 Apr 25.
10. Brain mechanism of foraging: Reward-dependent synaptic plasticity versus neural integration of values.
    Proc Natl Acad Sci U S A. 2024 Apr 2;121(14):e2318521121. doi: 10.1073/pnas.2318521121. Epub 2024 Mar 29.

References Cited in This Article

1. Discrete attractor dynamics underlies persistent activity in the frontal cortex.
   Nature. 2019 Feb;566(7743):212-217. doi: 10.1038/s41586-019-0919-7. Epub 2019 Feb 6.
2. Stable population coding for working memory coexists with heterogeneous neural dynamics in prefrontal cortex.
   Proc Natl Acad Sci U S A. 2017 Jan 10;114(2):394-399. doi: 10.1073/pnas.1619449114. Epub 2016 Dec 27.
3. Demixed principal component analysis of neural population data.
   Elife. 2016 Apr 12;5:e10989. doi: 10.7554/eLife.10989.
4. Is cortical connectivity optimized for storing information?
   Nat Neurosci. 2016 May;19(5):749-755. doi: 10.1038/nn.4286. Epub 2016 Apr 11.
5. Inferring learning rules from distributions of firing rates in cortical neurons.
   Nat Neurosci. 2015 Dec;18(12):1804-10. doi: 10.1038/nn.4158. Epub 2015 Nov 2.
6. Distinct recurrent versus afferent dynamics in cortical visual processing.
   Nat Neurosci. 2015 Dec;18(12):1789-97. doi: 10.1038/nn.4153. Epub 2015 Oct 26.
7. Asynchronous Rate Chaos in Spiking Neuronal Circuits.
   PLoS Comput Biol. 2015 Jul 31;11(7):e1004266. doi: 10.1371/journal.pcbi.1004266. eCollection 2015 Jul.
8. Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks.
   Nat Commun. 2015 Apr 21;6:6922. doi: 10.1038/ncomms7922.
9. Formation and maintenance of neuronal assemblies through synaptic plasticity.
   Nat Commun. 2014 Nov 14;5:5319. doi: 10.1038/ncomms6319.
10. Modeling the dynamic interaction of Hebbian and homeostatic plasticity.
    Neuron. 2014 Oct 22;84(2):497-510. doi: 10.1016/j.neuron.2014.09.036.

Attractor Dynamics in Networks with Learning Rules Inferred from In Vivo Data.

Affiliations

Department of Statistics, The University of Chicago, Chicago, IL 60637, USA.

Department of Statistics, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, Duke University, Durham, NC 27710, USA; Department of Physics, Duke University, Durham, NC 27708, USA.

Publication Information

Neuron. 2018 Jul 11;99(1):227-238.e4. doi: 10.1016/j.neuron.2018.05.038. Epub 2018 Jun 14.

DOI: 10.1016/j.neuron.2018.05.038
PMID: 29909997
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6091895/
Abstract

The attractor neural network scenario is a popular scenario for memory storage in the association cortex, but there is still a large gap between models based on this scenario and experimental data. We study a recurrent network model in which both learning rules and distribution of stored patterns are inferred from distributions of visual responses for novel and familiar images in the inferior temporal cortex (ITC). Unlike classical attractor neural network models, our model exhibits graded activity in retrieval states, with distributions of firing rates that are close to lognormal. Inferred learning rules are close to maximizing the number of stored patterns within a family of unsupervised Hebbian learning rules, suggesting that learning rules in ITC are optimized to store a large number of attractor states. Finally, we show that there exist two types of retrieval states: one in which firing rates are constant in time and another in which firing rates fluctuate chaotically.
