Similar Articles

1. Quantifying information stored in synaptic connections rather than in firing patterns of neural networks. bioRxiv. 2024 Nov 26:2024.11.26.625447. doi: 10.1101/2024.11.26.625447.
2. Quantifying information stored in synaptic connections rather than in firing patterns of neural networks. ArXiv. 2024 Nov 26:arXiv:2411.17692v1.
3. Storage capacity of networks with discrete synapses and sparsely encoded memories. Phys Rev E. 2022 May;105(5-1):054408. doi: 10.1103/PhysRevE.105.054408.
4. Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks. Entropy (Basel). 2019 Jul 25;21(8):726. doi: 10.3390/e21080726.
5. Firing rate distributions in plastic networks of spiking neurons. Netw Neurosci. 2025 Mar 20;9(1):447-474. doi: 10.1162/netn_a_00442. eCollection 2025.
6. Selective connectivity enhances storage capacity in attractor models of memory function. Front Syst Neurosci. 2022 Sep 15;16:983147. doi: 10.3389/fnsys.2022.983147. eCollection 2022.
7. A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks. PLoS Comput Biol. 2015 Aug 20;11(8):e1004439. doi: 10.1371/journal.pcbi.1004439. eCollection 2015 Aug.
8. Homeostatic control of synaptic rewiring in recurrent networks induces the formation of stable memory engrams. PLoS Comput Biol. 2022 Feb 10;18(2):e1009836. doi: 10.1371/journal.pcbi.1009836. eCollection 2022 Feb.
9. Time-domain brain: temporal mechanisms for brain functions using time-delay nets, holographic processes, radio communications, and emergent oscillatory sequences. Front Comput Neurosci. 2025 Feb 18;19:1540532. doi: 10.3389/fncom.2025.1540532. eCollection 2025.
10. Learning neural connectivity from firing activity: efficient algorithms with provable guarantees on topology. J Comput Neurosci. 2018 Apr;44(2):253-272. doi: 10.1007/s10827-018-0678-8. Epub 2018 Feb 20.

Quantifying information stored in synaptic connections rather than in firing patterns of neural networks.

Author Information

Fan Xinhao, Mysore Shreesh P

Publication Information

bioRxiv. 2024 Nov 26:2024.11.26.625447. doi: 10.1101/2024.11.26.625447.

DOI: 10.1101/2024.11.26.625447
PMID: 39651282
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11623622/
Abstract

A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of connections among the constituent neurons. However, in contrast to the well-established theory for quantifying information encoded by the firing patterns of neural networks, little is known about quantifying information encoded by its synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar for associative neural networks, and data that follow mixtures of broadly applicable multivariate log-normal distributions. Specifically, we analytically derive the Shannon mutual information between the data and singletons, pairs, triplets, quadruplets, and arbitrary n-tuples of synaptic connections within the network. Our framework corroborates well-established insights about storage capacity of, and distributed coding by, neural firing patterns. Strikingly, it discovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. Taken together, this study introduces an interpretable framework for quantitatively understanding information storage in neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.
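The abstract's premise, that learned patterns can be recovered from the synaptic weights alone, can be sketched with a minimal discrete Hopfield network. This is only an illustrative toy, not the paper's framework (the authors analyze continuous Hopfield networks and derive mutual information analytically for log-normal data); the network size, pattern count, and Hebbian storage rule below are assumptions chosen for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 3  # neurons and stored patterns; P is far below the ~0.14*N capacity
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian storage: all information about the patterns now lives in W
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)  # no self-connections

def recall(state, steps=20):
    """Synchronous sign-update dynamics, stopping at a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1.0  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Corrupt a stored pattern, then recover it using only the weights
noisy = patterns[0].copy()
flip = rng.choice(N, size=5, replace=False)
noisy[flip] *= -1.0
recovered = recall(noisy)
overlap = float(recovered @ patterns[0]) / N  # 1.0 means perfect recall
```

Because the corrupted input is restored by the dynamics that W defines, the weights demonstrably encode the stored data; the paper goes further and quantifies that encoding as Shannon mutual information between the data and n-tuples of synapses, finding that joint synaptic information exceeds the sum over individual synapses.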
