Fan Xinhao, Mysore Shreesh P
bioRxiv. 2024 Nov 26:2024.11.26.625447. doi: 10.1101/2024.11.26.625447.
A cornerstone of our understanding of both biological and artificial neural networks is that they store information in the strengths of connections among the constituent neurons. However, in contrast to the well-established theory for quantifying information encoded by the firing patterns of neural networks, little is known about quantifying information encoded by a network's synaptic connections. Here, we develop a theoretical framework using continuous Hopfield networks as an exemplar for associative neural networks, and data that follow mixtures of broadly applicable multivariate log-normal distributions. Specifically, we analytically derive the Shannon mutual information between the data and singletons, pairs, triplets, quadruplets, and arbitrary n-tuples of synaptic connections within the network. Our framework corroborates well-established insights about the storage capacity of, and distributed coding by, neural firing patterns. Strikingly, it discovers synergistic interactions among synapses, revealing that the information encoded jointly by all the synapses exceeds the 'sum of its parts'. Taken together, this study introduces an interpretable framework for quantitatively understanding information storage in neural networks, one that illustrates the duality of synaptic connectivity and neural population activity in learning and memory.
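The premise that associative networks store information in connection strengths can be illustrated with a minimal continuous Hopfield network. The sketch below is not the paper's analytical framework: it uses illustrative binary patterns and Hebbian weights rather than the multivariate log-normal mixture data the study analyzes, and it demonstrates retrieval from synaptic weights rather than computing mutual information.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5  # neurons, stored patterns (well below capacity)

# Illustrative binary patterns; the paper instead uses data drawn from
# mixtures of multivariate log-normal distributions.
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# Hebbian synaptic weights: the stored information lives entirely in W.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def retrieve(x, beta=4.0, steps=50):
    """Continuous Hopfield dynamics: x <- tanh(beta * W x)."""
    for _ in range(steps):
        x = np.tanh(beta * (W @ x))
    return x

# Cue the network with a corrupted copy of pattern 0 (~15% of signs flipped).
cue = patterns[0] * np.sign(rng.random(N) - 0.15)
out = retrieve(cue.astype(float))

# Overlap with the stored pattern approaches 1.0 when retrieval succeeds.
overlap = float(np.sign(out) @ patterns[0]) / N
print(overlap)
```

Because retrieval succeeds purely from `W`, the weights themselves carry the information about the stored data, which is exactly the quantity the abstract's mutual-information analysis makes precise for single synapses up to arbitrary n-tuples.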