
Shared input and recurrency in neural networks for metabolically efficient information transmission.

Affiliations

Laboratory of Computational Neuroscience, Institute of Physiology of the Czech Academy of Sciences, Prague, Czech Republic.

Neural Coding and Brain Computing Unit, Okinawa Institute of Science and Technology, Onna-son, Okinawa, Japan.

Publication information

PLoS Comput Biol. 2024 Feb 23;20(2):e1011896. doi: 10.1371/journal.pcbi.1011896. eCollection 2024 Feb.

Abstract

Shared input to a population of neurons induces noise correlations, which can decrease the information carried by the population activity. Inhibitory feedback in recurrent neural networks can reduce these noise correlations and thus increase the information carried by the population activity. However, the activity of inhibitory neurons is costly. The inhibitory feedback also decreases the gain of the population, so depolarizing its neurons requires stronger excitatory synaptic input, which is associated with higher ATP consumption. Given that the goal of neural populations is to transmit as much information as possible at minimal metabolic cost, it is unclear whether the increased reliability of information transmission provided by inhibitory feedback compensates for the additional costs. We analyze this problem in a network of leaky integrate-and-fire neurons receiving correlated input. By maximizing mutual information under metabolic cost constraints, we show that there is an optimal strength of recurrent connections in the network, which maximizes the mutual information-per-cost. For higher values of input correlation, the mutual information-per-cost is higher for recurrent networks with inhibitory feedback than for feedforward networks without any inhibitory neurons. Our results therefore show that the optimal synaptic strength of a recurrent network can be inferred from metabolically efficient coding arguments, and that decorrelation of the input by inhibitory feedback compensates for the associated increase in metabolic cost.
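The core mechanism in the abstract (shared input inducing spike-count correlations in leaky integrate-and-fire neurons, and inhibitory feedback reducing them at the cost of lower firing) can be illustrated with a minimal two-neuron sketch. This is not the authors' model: parameter values, the Euler discretization, and the simple one-step cross-inhibition term `w_inh` are illustrative assumptions.

```python
import numpy as np

def simulate_pair(c, w_inh=0.0, n_trials=200, t_steps=1000, dt=1e-3, seed=0):
    """Spike counts of two LIF neurons driven by partially shared noise.

    c     -- fraction of shared input variance (source of noise correlations)
    w_inh -- strength of cross-inhibition (a crude stand-in for inhibitory
             feedback; illustrative, not the paper's network architecture)
    Returns an (n_trials, 2) array of per-trial spike counts.
    """
    rng = np.random.default_rng(seed)
    tau, v_th, v_reset = 20e-3, 1.0, 0.0  # membrane time constant, threshold, reset
    mu, sigma = 1.2, 0.5                  # suprathreshold mean drive, noise amplitude
    counts = np.zeros((n_trials, 2))
    for tr in range(n_trials):
        v = np.zeros(2)
        spiked = np.zeros(2)
        for _ in range(t_steps):
            shared = rng.standard_normal()        # input common to both neurons
            private = rng.standard_normal(2)      # independent input per neuron
            xi = np.sqrt(c) * shared + np.sqrt(1.0 - c) * private
            # Euler step of the LIF dynamics; each neuron is additionally
            # inhibited by the other neuron's spike on the previous step.
            v += (mu - v) * dt / tau + sigma * np.sqrt(dt / tau) * xi
            v -= w_inh * spiked[::-1]
            spiked = (v >= v_th).astype(float)
            v[v >= v_th] = v_reset
            counts[tr] += spiked
    return counts

ff = simulate_pair(c=0.5)              # "feedforward": no inhibition
rec = simulate_pair(c=0.5, w_inh=0.5)  # with cross-inhibition
rho_ff = np.corrcoef(ff[:, 0], ff[:, 1])[0, 1]
rho_rec = np.corrcoef(rec[:, 0], rec[:, 1])[0, 1]
print(f"spike-count correlation: feedforward {rho_ff:.3f}, recurrent {rho_rec:.3f}")
```

Comparing the two correlations shows the trade-off discussed in the abstract: inhibition tends to lower the shared-input correlation but also suppresses firing, so stronger drive (and hence more ATP) would be needed to restore the output rate.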


Fig 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/fd14/10917264/c9835babfe2d/pcbi.1011896.g001.jpg
