
Quantifying information stored in synaptic connections rather than in firing patterns of neural networks.

Authors

Fan Xinhao, Mysore Shreesh P

Affiliations

Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA.

Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.

Publication

ArXiv. 2024 Nov 26:arXiv:2411.17692v1.

PMID: 39650602
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11623702/
Abstract

A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of connections among the constituent neurons. However, in contrast to the well-established theory for quantifying information encoded by the firing patterns of neural networks, little is known about quantifying information encoded by its synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar for associative neural networks, and data that follow mixtures of broadly applicable multivariate log-normal distributions. Specifically, we analytically derive the Shannon mutual information between the data and singletons, pairs, triplets, quadruplets, and arbitrary n-tuples of synaptic connections within the network. Our framework corroborates well-established insights about storage capacity of, and distributed coding by, neural firing patterns. Strikingly, it discovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. Taken together, this study introduces an interpretable framework for quantitatively understanding information storage in neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.
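The "sum of its parts" claim, that information encoded jointly by several connections can exceed the total of their individual contributions, is the information-theoretic notion of synergy. A minimal toy illustration (not the paper's Hopfield/log-normal framework, just standard Shannon mutual information on a contrived example): when an output is the XOR of two independent fair bits, each bit alone carries zero information about the output, yet the pair determines it completely.

```python
import math
from itertools import product

def mutual_information(joint):
    """Shannon mutual information I(X;Y) in bits, from a joint pmf
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    mi = 0.0
    for (x, y), p in joint.items():
        if p > 0:
            mi += p * math.log2(p / (px[x] * py[y]))
    return mi

# Toy setup: output X is the XOR of two independent fair bits S1, S2.
joint_s1 = {}    # joint pmf of (X, S1)
joint_pair = {}  # joint pmf of (X, (S1, S2))
for s1, s2 in product([0, 1], repeat=2):
    x = s1 ^ s2
    joint_s1[(x, s1)] = joint_s1.get((x, s1), 0.0) + 0.25
    joint_pair[(x, (s1, s2))] = 0.25

print(mutual_information(joint_s1))    # I(X; S1) = 0.0 bits
print(mutual_information(joint_pair))  # I(X; S1, S2) = 1.0 bit
```

Here the joint information (1 bit) strictly exceeds the sum of the singleton informations (0 + 0 bits), which is the qualitative phenomenon the paper quantifies analytically for n-tuples of synaptic weights.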


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07da/11623702/a81b45bf1b66/nihpp-2411.17692v1-f0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07da/11623702/56fb50b7fc73/nihpp-2411.17692v1-f0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07da/11623702/912d96d45e9f/nihpp-2411.17692v1-f0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/07da/11623702/b64936d90af3/nihpp-2411.17692v1-f0004.jpg

Similar Articles

1. Quantifying information stored in synaptic connections rather than in firing patterns of neural networks.
   ArXiv. 2024 Nov 26:arXiv:2411.17692v1.
2. Quantifying information stored in synaptic connections rather than in firing patterns of neural networks.
   bioRxiv. 2024 Nov 26:2024.11.26.625447. doi: 10.1101/2024.11.26.625447.
3. Storage capacity of networks with discrete synapses and sparsely encoded memories.
   Phys Rev E. 2022 May;105(5-1):054408. doi: 10.1103/PhysRevE.105.054408.
4. Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks.
   Entropy (Basel). 2019 Jul 25;21(8):726. doi: 10.3390/e21080726.
5. Firing rate distributions in plastic networks of spiking neurons.
   Netw Neurosci. 2025 Mar 20;9(1):447-474. doi: 10.1162/netn_a_00442. eCollection 2025.
6. Selective connectivity enhances storage capacity in attractor models of memory function.
   Front Syst Neurosci. 2022 Sep 15;16:983147. doi: 10.3389/fnsys.2022.983147. eCollection 2022.
7. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks.
   PLoS Comput Biol. 2015 Aug 20;11(8):e1004439. doi: 10.1371/journal.pcbi.1004439. eCollection 2015 Aug.
8. Homeostatic control of synaptic rewiring in recurrent networks induces the formation of stable memory engrams.
   PLoS Comput Biol. 2022 Feb 10;18(2):e1009836. doi: 10.1371/journal.pcbi.1009836. eCollection 2022 Feb.
9. Time-domain brain: temporal mechanisms for brain functions using time-delay nets, holographic processes, radio communications, and emergent oscillatory sequences.
   Front Comput Neurosci. 2025 Feb 18;19:1540532. doi: 10.3389/fncom.2025.1540532. eCollection 2025.
10. Learning neural connectivity from firing activity: efficient algorithms with provable guarantees on topology.
   J Comput Neurosci. 2018 Apr;44(2):253-272. doi: 10.1007/s10827-018-0678-8. Epub 2018 Feb 20.

References Cited in This Article

1. Intermittent rate coding and cue-specific ensembles support working memory.
   Nature. 2024 Dec;636(8042):422-429. doi: 10.1038/s41586-024-08139-9. Epub 2024 Nov 6.
2. Distinguishing examples while building concepts in hippocampal and artificial networks.
   Nat Commun. 2024 Jan 20;15(1):647. doi: 10.1038/s41467-024-44877-0.
3. Organizing memories for generalization in complementary learning systems.
   Nat Neurosci. 2023 Aug;26(8):1438-1448. doi: 10.1038/s41593-023-01382-9. Epub 2023 Jul 20.
4. A synergistic core for human brain evolution and cognition.
   Nat Neurosci. 2022 Jun;25(6):771-782. doi: 10.1038/s41593-022-01070-0. Epub 2022 May 26.
5. Scaling of sensory information in large neural populations shows signatures of information-limiting correlations.
   Nat Commun. 2021 Jan 20;12(1):473. doi: 10.1038/s41467-020-20722-y.
6. Distributed coding of choice, action and engagement across the mouse brain.
   Nature. 2019 Dec;576(7786):266-273. doi: 10.1038/s41586-019-1787-x. Epub 2019 Nov 27.
7. A Tutorial for Information Theory in Neuroscience.
   eNeuro. 2018 Sep 11;5(3). doi: 10.1523/ENEURO.0052-18.2018. eCollection 2018 May-Jun.
8. Multilayer motif analysis of brain networks.
   Chaos. 2017 Apr;27(4):047404. doi: 10.1063/1.4979282.
9. 'Activity-silent' working memory in prefrontal cortex: a dynamic coding framework.
   Trends Cogn Sci. 2015 Jul;19(7):394-405. doi: 10.1016/j.tics.2015.05.004. Epub 2015 Jun 4.
10. Triplet correlations among similarly tuned cells impact population coding.
   Front Comput Neurosci. 2015 May 18;9:57. doi: 10.3389/fncom.2015.00057. eCollection 2015.