Xinhao Fan, Shreesh P. Mysore
Department of Neuroscience, Johns Hopkins University, Baltimore, MD, USA.
Department of Psychological and Brain Sciences, Johns Hopkins University, Baltimore, MD, USA.
arXiv preprint arXiv:2411.17692v1, 26 Nov 2024.
A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of connections among their constituent neurons. However, in contrast to the well-established theory for quantifying the information encoded by the firing patterns of a neural network, little is known about how to quantify the information encoded by its synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar of associative neural networks, and data that follow mixtures of broadly applicable multivariate log-normal distributions. Specifically, we analytically derive the Shannon mutual information between the data and singletons, pairs, triplets, quadruplets, and arbitrary n-tuples of synaptic connections within the network. Our framework corroborates well-established insights about the storage capacity of, and distributed coding by, neural firing patterns. Strikingly, it uncovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. Taken together, this study introduces an interpretable framework for quantitatively understanding information storage in neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.
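The abstract's central synergy claim, that information encoded jointly can exceed the 'sum of its parts', is a general information-theoretic phenomenon. As a minimal illustration (not the paper's analytical derivation, which concerns Hopfield-network synapses and log-normal mixture data), the classic XOR example shows two variables that are each individually uninformative about a target yet jointly determine it:

```python
import itertools
from collections import Counter
from math import log2

def mutual_information(samples, idx_a, idx_b):
    """Estimate I(A;B) in bits from joint samples, where A and B are
    the sub-tuples of each sample selected by idx_a and idx_b."""
    n = len(samples)
    pa, pb, pab = Counter(), Counter(), Counter()
    for s in samples:
        a = tuple(s[i] for i in idx_a)
        b = tuple(s[i] for i in idx_b)
        pa[a] += 1
        pb[b] += 1
        pab[(a, b)] += 1
    mi = 0.0
    for (a, b), c in pab.items():
        p_ab = c / n
        mi += p_ab * log2(p_ab / ((pa[a] / n) * (pb[b] / n)))
    return mi

# Enumerate all equally likely (x1, x2, y = x1 XOR x2) outcomes.
samples = [(x1, x2, x1 ^ x2)
           for x1, x2 in itertools.product((0, 1), repeat=2)]

i1  = mutual_information(samples, (0,), (2,))    # I(X1; Y)     -> 0 bits
i2  = mutual_information(samples, (1,), (2,))    # I(X2; Y)     -> 0 bits
i12 = mutual_information(samples, (0, 1), (2,))  # I(X1, X2; Y) -> 1 bit
print(i1, i2, i12)
```

Here the joint mutual information (1 bit) strictly exceeds the sum of the singleton terms (0 bits), the same qualitative signature of synergy the framework reports for n-tuples of synaptic connections.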