The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules.

Affiliations

Max Planck School of Cognition, Leipzig, Germany.

University of Cambridge, Engineering Department, Cambridge, United Kingdom.

Publication Information

PLoS Comput Biol. 2021 Oct 11;17(10):e1009458. doi: 10.1371/journal.pcbi.1009458. eCollection 2021 Oct.

DOI: 10.1371/journal.pcbi.1009458
PMID: 34634045
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8584672/
Abstract

During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and the processes that determine which synapses and neurons are ultimately pruned, remains unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size and (3) pruning based on a locally-available measure of importance based on Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.
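The abstract's two pruning rules can be made concrete with a short sketch. Below is a minimal, illustrative Python implementation, not the paper's code: the activity-driven rule scores each synapse with a locally available Fisher-information proxy, taken here as the variance of presynaptic/postsynaptic coactivation across activity samples (an assumption based on the abstract's statement that importance can be read off correlations between presynaptic and postsynaptic neurons), and is contrasted with pruning by weight magnitude alone, which the abstract reports does not optimize network size. The helper names fisher_scores and prune_lowest are hypothetical.

```python
import numpy as np

def fisher_scores(pre, post):
    """Locally available importance proxy for each synapse w_ij.

    pre:  (T, n_pre)  binary presynaptic activity samples
    post: (T, n_post) binary postsynaptic activity samples

    Assumption of this sketch: use the variance of the coactivation
    v_i * h_j across samples as a diagonal Fisher-information
    estimate; it depends only on pre/post activity, i.e. on
    information locally available at the synapse.
    """
    coact = pre[:, :, None] * post[:, None, :]   # (T, n_pre, n_post)
    return coact.var(axis=0)                     # (n_pre, n_post)

def prune_lowest(W, scores, frac):
    """Zero out the fraction `frac` of surviving synapses with the
    lowest scores; already-pruned (zero-weight) synapses stay pruned."""
    alive = W != 0
    thresh = np.quantile(scores[alive], frac)
    return np.where(alive & (scores > thresh), W, 0.0)

# Toy comparison of the two rules on random binary activity.
rng = np.random.default_rng(0)
T, n_pre, n_post = 5000, 20, 10
W = rng.normal(size=(n_pre, n_post))
pre = rng.integers(0, 2, size=(T, n_pre)).astype(float)
post = (pre @ W + rng.normal(size=(T, n_post)) > 0).astype(float)

W_mag = prune_lowest(W, np.abs(W), frac=0.5)               # weight-only rule
W_fis = prune_lowest(W, fisher_scores(pre, post), 0.5)     # activity-driven rule
print("synapses kept:", int((W_mag != 0).sum()), int((W_fis != 0).sum()))
```

In the paper the model is a deep Boltzmann machine and pruning is interleaved with further training; the sketch above only illustrates the scoring-and-selection step. Per the abstract, whole neurons would additionally be removed when the synapses on their axons all score low, i.e. when the information they convey is not relevant to the target population.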

Figures (PMC8584672):
Fig 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d481/8584672/672a5c7fa0d9/pcbi.1009458.g001.jpg
Fig 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d481/8584672/a627e73f5440/pcbi.1009458.g002.jpg
Fig 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d481/8584672/13ce581b4c90/pcbi.1009458.g003.jpg
Fig 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d481/8584672/40b9817a740f/pcbi.1009458.g004.jpg

Similar Articles

1. The information theory of developmental pruning: Optimizing global network architectures using local synaptic rules.
PLoS Comput Biol. 2021 Oct 11;17(10):e1009458. doi: 10.1371/journal.pcbi.1009458. eCollection 2021 Oct.
2. Neuronal regulation: A mechanism for synaptic pruning during brain maturation.
Neural Comput. 1999 Nov 15;11(8):2061-80. doi: 10.1162/089976699300016089.
3. Dynamically Optimizing Network Structure Based on Synaptic Pruning in the Brain.
Front Syst Neurosci. 2021 Jun 4;15:620558. doi: 10.3389/fnsys.2021.620558. eCollection 2021.
4. Rules of engagement: factors that regulate activity-dependent synaptic plasticity during neural network development.
Biol Bull. 2010 Oct;219(2):81-99. doi: 10.1086/BBLv219n2p81.
5. A few strong connections: optimizing information retention in neuronal avalanches.
BMC Neurosci. 2010 Jan 6;11:3. doi: 10.1186/1471-2202-11-3.
6. Decreasing-Rate Pruning Optimizes the Construction of Efficient and Robust Distributed Networks.
PLoS Comput Biol. 2015 Jul 28;11(7):e1004347. doi: 10.1371/journal.pcbi.1004347. eCollection 2015 Jul.
7. EvoPruneDeepTL: An evolutionary pruning model for transfer learning based deep neural networks.
Neural Netw. 2023 Jan;158:59-82. doi: 10.1016/j.neunet.2022.10.011. Epub 2022 Nov 4.
8. Cell-type-specific neuromodulation guides synaptic credit assignment in a spiking neural network.
Proc Natl Acad Sci U S A. 2021 Dec 21;118(51). doi: 10.1073/pnas.2111821118.
9. Pruning recurrent neural networks replicates adolescent changes in working memory and reinforcement learning.
Proc Natl Acad Sci U S A. 2022 May 31;119(22):e2121331119. doi: 10.1073/pnas.2121331119. Epub 2022 May 27.
10. Optimal pruning in neural networks.
Phys Rev E Stat Phys Plasmas Fluids Relat Interdiscip Topics. 2000 Dec;62(6 Pt B):8387-94. doi: 10.1103/physreve.62.8387.

Cited By

1. Pruning recurrent neural networks replicates adolescent changes in working memory and reinforcement learning.
Proc Natl Acad Sci U S A. 2022 May 31;119(22):e2121331119. doi: 10.1073/pnas.2121331119. Epub 2022 May 27.
2. Periodicity Pitch Perception Part III: Sensibility and Pachinko Volatility.
Front Neurosci. 2022 Mar 8;16:736642. doi: 10.3389/fnins.2022.736642. eCollection 2022.

References

1. New role for circuit expansion for learning in neural networks.
Phys Rev E. 2021 Feb;103(2-1):022404. doi: 10.1103/PhysRevE.103.022404.
2. Contrastive Similarity Matching for Supervised Learning.
Neural Comput. 2021 Apr 13;33(5):1300-1328. doi: 10.1162/neco_a_01374.
3. Optimal Encoding in Stochastic Latent-Variable Models.
Entropy (Basel). 2020 Jun 28;22(7):714. doi: 10.3390/e22070714.
4. Homeostatic mechanisms regulate distinct aspects of cortical circuit dynamics.
Proc Natl Acad Sci U S A. 2020 Sep 29;117(39):24514-24525. doi: 10.1073/pnas.1918368117. Epub 2020 Sep 11.
5. Contrastive Hebbian Feedforward Learning for Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2020 Jun;31(6):2118-2128. doi: 10.1109/TNNLS.2019.2927957. Epub 2019 Jul 31.
6. Fundamental bounds on learning performance in neural circuits.
Proc Natl Acad Sci U S A. 2019 May 21;116(21):10537-10546. doi: 10.1073/pnas.1813416116. Epub 2019 May 6.
7. Increased synapse elimination by microglia in schizophrenia patient-derived models of synaptic pruning.
Nat Neurosci. 2019 Mar;22(3):374-385. doi: 10.1038/s41593-018-0334-7. Epub 2019 Feb 4.
8. Blindfold learning of an accurate neural metric.
Proc Natl Acad Sci U S A. 2018 Mar 27;115(13):3267-3272. doi: 10.1073/pnas.1718710115. Epub 2018 Mar 12.
9. Why Do Similarity Matching Objectives Lead to Hebbian/Anti-Hebbian Networks?
Neural Comput. 2018 Jan;30(1):84-124. doi: 10.1162/neco_a_01018. Epub 2017 Sep 28.
10. Errant gardeners: glial-cell-dependent synaptic pruning and neurodevelopmental disorders.
Nat Rev Neurosci. 2017 Nov;18(11):658-670. doi: 10.1038/nrn.2017.110. Epub 2017 Sep 21.