
Learning and criticality in a self-organizing model of connectome growth.

Authors

Cirunay Michelle T, Batac Rene C, Ódor Géza

Affiliations

Institute of Technical Physics and Materials Science, HUN-REN Centre for Energy Research, P.O. Box 49, 1525, Budapest, Hungary.

Department of Physics, College of Science, De La Salle University, 2401 Taft Ave., Manila, 0922, Philippines.

Publication

Sci Rep. 2025 Aug 29;15(1):31890. doi: 10.1038/s41598-025-16377-8.

Abstract

The exploration of brain networks has reached an important milestone, as relatively large and reliable data sets have been gathered for the connectomes of different species. Analyses of connectome data sets reveal that structural connection lengths follow an exponential rule, the distributions of in- and out-node strengths follow heavy-tailed lognormal statistics, and functional network properties exhibit power-law tails, suggesting that the brain operates close to a critical point where computational capability and sensitivity to stimuli are optimal. Because these universal network features emerge from bottom-up (self-)organization, one can ask whether they can be modeled within a common framework, particularly through the lens of the criticality of statistical physical systems. Here, we simultaneously reproduce the power-law statistics of connectome edge weights and the lognormal distributions of node strengths with an avalanche-type model with learning that operates on baseline networks mimicking the neuronal circuitry. We observe that the avalanches created by a sandpile-like model on simulated neurons connected by a hierarchical modular network (HMN) produce robust power-law avalanche size distributions with the critical exponent 3/2 characteristic of neuronal systems. Introducing Hebbian learning, wherein neurons that 'fire together, wire together,' recovers the power-law distribution of edge weights and the lognormal distributions of node degrees, comparable to those obtained from connectome data. Our results strengthen the notion of a critical brain, one whose local interactions drive connectivity and learning without the need for external intervention or precise tuning.
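The model class sketched in the abstract can be illustrated in a few lines: sandpile-like cascades on a random directed network (a stand-in for the paper's HMN baseline), with a Hebbian increment applied to edges whose two endpoints topple in the same avalanche. All parameter values, the network construction, and the dissipation probability below are illustrative assumptions for a toy sketch, not the authors' actual settings.

```python
import random
from collections import defaultdict

def run_avalanche_model(n_nodes=64, n_edges=256, threshold=2,
                        n_drives=2000, eta=0.01, p_diss=0.05, seed=1):
    """Toy sandpile-like avalanches with a Hebbian weight update.

    Each node accumulates grains; on reaching `threshold` it topples,
    forwarding grains to random out-neighbours (each grain is lost with
    probability `p_diss`). Edges whose endpoints both topple in the same
    avalanche are strengthened: 'fire together, wire together'.
    """
    rng = random.Random(seed)
    # Random directed baseline graph (illustrative stand-in for a
    # hierarchical modular network).
    edges = set()
    while len(edges) < n_edges:
        u, v = rng.randrange(n_nodes), rng.randrange(n_nodes)
        if u != v:
            edges.add((u, v))
    out_nb = defaultdict(list)
    for u, v in edges:
        out_nb[u].append(v)
    weight = {e: 1.0 for e in edges}
    load = [0] * n_nodes
    sizes = []

    for _ in range(n_drives):
        load[rng.randrange(n_nodes)] += 1          # slow external drive
        fired, size = set(), 0
        stack = [i for i in range(n_nodes) if load[i] >= threshold]
        while stack:
            i = stack.pop()
            if load[i] < threshold:
                continue
            load[i] -= threshold                   # topple node i
            fired.add(i)
            size += 1
            for _g in range(threshold):            # redistribute grains
                if out_nb[i] and rng.random() > p_diss:
                    j = rng.choice(out_nb[i])
                    load[j] += 1
                    if load[j] >= threshold:
                        stack.append(j)
        sizes.append(size)
        # Hebbian step: strengthen edges between co-active nodes.
        for u, v in edges:
            if u in fired and v in fired:
                weight[(u, v)] += eta
    return sizes, weight
```

Fitting the tail of `sizes` would be the place to look for a power law; this toy is only meant to show the mechanism and will not reproduce the paper's exponents quantitatively.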


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5b6a/12397319/f0a9ab9dbf1f/41598_2025_16377_Fig1_HTML.jpg
