Suppr 超能文献



Introducing principles of synaptic integration in the optimization of deep neural networks.

Affiliations

IBM Research - Zurich, Rüschlikon, Switzerland.

Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland.

Publication

Nat Commun. 2022 Apr 7;13(1):1885. doi: 10.1038/s41467-022-29491-2.

DOI: 10.1038/s41467-022-29491-2
PMID: 35393422
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8989917/
Abstract

Plasticity circuits in the brain are known to be influenced by the distribution of the synaptic weights through the mechanisms of synaptic integration and local regulation of synaptic strength. However, the complex interplay of stimulation-dependent plasticity with local learning signals is disregarded by most of the artificial neural network training algorithms devised so far. Here, we propose a novel biologically inspired optimizer for artificial and spiking neural networks that incorporates key principles of synaptic plasticity observed in cortical dendrites: GRAPES (Group Responsibility for Adjusting the Propagation of Error Signals). GRAPES implements a weight-distribution-dependent modulation of the error signal at each node of the network. We show that this biologically inspired mechanism leads to a substantial improvement of the performance of artificial and spiking networks with feedforward, convolutional, and recurrent architectures, it mitigates catastrophic forgetting, and it is optimally suited for dedicated hardware implementations. Overall, our work indicates that reconciling neurophysiology insights with machine intelligence is key to boosting the performance of neural networks.
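The abstract describes GRAPES as a weight-distribution-dependent modulation of the error signal at each node, but does not state the actual modulation rule. The toy NumPy sketch below is therefore purely illustrative: the `modulation` function (scaling each hidden node's backpropagated error by its incoming-weight magnitude relative to the layer average) is a hypothetical stand-in, not the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

def modulation(W):
    # Hypothetical per-node factor from the incoming-weight distribution:
    # each node's column-wise L1 norm divided by the layer's mean norm, so
    # the factors average to 1 and "stronger" nodes get larger error signals.
    norms = np.abs(W).sum(axis=0)
    return norms / norms.mean()

# Tiny one-hidden-layer regression network on synthetic data.
W1 = rng.normal(0, 0.5, (3, 8))
W2 = rng.normal(0, 0.5, (8, 1))
X = rng.normal(size=(64, 3))
y = 0.1 * X.sum(axis=1, keepdims=True) ** 2

def mse():
    return float(np.mean((np.tanh(X @ W1) @ W2 - y) ** 2))

initial_loss = mse()
lr = 0.05
for _ in range(200):
    h = np.tanh(X @ W1)                      # forward pass
    err_out = h @ W2 - y                     # output-layer error
    err_h = (err_out @ W2.T) * (1 - h ** 2)  # backpropagated hidden error
    err_h *= modulation(W1)                  # GRAPES-like node-wise modulation
    W2 -= lr * h.T @ err_out / len(X)
    W1 -= lr * X.T @ err_h / len(X)

final_loss = mse()
print(f"MSE: {initial_loss:.4f} -> {final_loss:.4f}")
```

Because the modulation factors are normalized to average 1, the update stays close in scale to plain gradient descent while redistributing error across nodes; the real method as described also covers spiking and recurrent networks, which this sketch does not attempt.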


Figures (PMC full text):
Fig 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a573/8989917/5e5f00be0c72/41467_2022_29491_Fig1_HTML.jpg
Fig 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a573/8989917/2f21220eb041/41467_2022_29491_Fig2_HTML.jpg
Fig 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a573/8989917/afea78dac2b5/41467_2022_29491_Fig3_HTML.jpg
Fig 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a573/8989917/83face88760e/41467_2022_29491_Fig4_HTML.jpg
Fig 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a573/8989917/5d52ae62a4c1/41467_2022_29491_Fig5_HTML.jpg
Fig 6: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a573/8989917/6d77d9ac1368/41467_2022_29491_Fig6_HTML.jpg
Fig 7: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a573/8989917/0767b5153081/41467_2022_29491_Fig7_HTML.jpg

Similar Articles

1. Introducing principles of synaptic integration in the optimization of deep neural networks.
Nat Commun. 2022 Apr 7;13(1):1885. doi: 10.1038/s41467-022-29491-2.
2. Adaptive structure evolution and biologically plausible synaptic plasticity for recurrent spiking neural networks.
Sci Rep. 2023 Oct 7;13(1):16924. doi: 10.1038/s41598-023-43488-x.
3. Brain-inspired neural circuit evolution for spiking neural networks.
Proc Natl Acad Sci U S A. 2023 Sep 26;120(39):e2218173120. doi: 10.1073/pnas.2218173120. Epub 2023 Sep 20.
4. HybridSNN: Combining Bio-Machine Strengths by Boosting Adaptive Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2023 Sep;34(9):5841-5855. doi: 10.1109/TNNLS.2021.3131356. Epub 2023 Sep 1.
5. A review of learning in biologically plausible spiking neural networks.
Neural Netw. 2020 Feb;122:253-272. doi: 10.1016/j.neunet.2019.09.036. Epub 2019 Oct 11.
6. Neuromorphic implementations of neurobiological learning algorithms for spiking neural networks.
Neural Netw. 2015 Dec;72:152-67. doi: 10.1016/j.neunet.2015.07.004. Epub 2015 Aug 18.
7. Continuous learning of spiking networks trained with local rules.
Neural Netw. 2022 Nov;155:512-522. doi: 10.1016/j.neunet.2022.09.003. Epub 2022 Sep 7.
8. An unsupervised STDP-based spiking neural network inspired by biologically plausible learning rules and connections.
Neural Netw. 2023 Aug;165:799-808. doi: 10.1016/j.neunet.2023.06.019. Epub 2023 Jun 22.
9. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation.
IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4243-4256. doi: 10.1109/TNNLS.2021.3056201. Epub 2022 Aug 31.
10. Mirrored STDP Implements Autoencoder Learning in a Network of Spiking Neurons.
PLoS Comput Biol. 2015 Dec 3;11(12):e1004566. doi: 10.1371/journal.pcbi.1004566. eCollection 2015 Dec.

Cited By

1. Personalized brain models link cognitive decline progression to underlying synaptic and connectivity degeneration.
Alzheimers Res Ther. 2025 Apr 5;17(1):74. doi: 10.1186/s13195-025-01718-6.
2. Explaining cocktail party effect and McGurk effect with a spiking neural network improved by Motif-topology.
Front Neurosci. 2023 Mar 20;17:1132269. doi: 10.3389/fnins.2023.1132269. eCollection 2023.
3. Synaptic Transistors Based on PVA: Chitosan Biopolymer Blended Electric-Double-Layer with High Ionic Conductivity.
Polymers (Basel). 2023 Feb 10;15(4):896. doi: 10.3390/polym15040896.

References

1. Dendritic normalisation improves learning in sparsely connected artificial neural networks.
PLoS Comput Biol. 2021 Aug 9;17(8):e1009202. doi: 10.1371/journal.pcbi.1009202. eCollection 2021 Aug.
2. Learning Without Feedback: Fixed Random Learning Signals Allow for Feedforward Training of Deep Neural Networks.
Front Neurosci. 2021 Feb 10;15:629892. doi: 10.3389/fnins.2021.629892. eCollection 2021.
3. The Synaptic Scaling Literature: A Systematic Review of Methodologies and Quality of Reporting.
Front Cell Neurosci. 2020 Jun 16;14:164. doi: 10.3389/fncel.2020.00164. eCollection 2020.
4. Editorial: Closed-loop iterations between neuroscience and artificial intelligence.
Front Syst Neurosci. 2022 Dec 19;16:1002095. doi: 10.3389/fnsys.2022.1002095. eCollection 2022.
5. Effects of synaptic integration on the dynamics and computational performance of spiking neural network.
Cogn Neurodyn. 2020 Jun;14(3):347-357. doi: 10.1007/s11571-020-09572-y. Epub 2020 Feb 19.
6. Heterosynaptic Plasticity in Cortical Interneurons.
J Neurosci. 2020 Feb 26;40(9):1793-1794. doi: 10.1523/JNEUROSCI.2567-19.2020.
7. Toward Training Recurrent Neural Networks for Lifelong Learning.
Neural Comput. 2020 Jan;32(1):1-35. doi: 10.1162/neco_a_01246. Epub 2019 Nov 8.
8. A deep learning framework for neuroscience.
Nat Neurosci. 2019 Nov;22(11):1761-1770. doi: 10.1038/s41593-019-0520-2. Epub 2019 Oct 28.
9. Engineering a Less Artificial Intelligence.
Neuron. 2019 Sep 25;103(6):967-979. doi: 10.1016/j.neuron.2019.08.034.
10. Theories of Error Back-Propagation in the Brain.
Trends Cogn Sci. 2019 Mar;23(3):235-250. doi: 10.1016/j.tics.2018.12.005. Epub 2019 Jan 28.
11. Deep Learning With Spiking Neurons: Opportunities and Challenges.
Front Neurosci. 2018 Oct 25;12:774. doi: 10.3389/fnins.2018.00774. eCollection 2018.