


Learning-efficient spiking neural networks with multi-compartment spatio-temporal backpropagation.

Authors

Liu Yuqian, Wang Yuechao, Zhang Chi, Yu Liao, Fang Ying, Chen Feng

Affiliations

Department of Automation, Tsinghua University, Beijing 100084, China.

The College of Computer and Cyber Security, Fujian Normal University, Fuzhou 350117, China.

Publication

iScience. 2025 Jun 3;28(7):112491. doi: 10.1016/j.isci.2025.112491. eCollection 2025 Jul 18.

DOI: 10.1016/j.isci.2025.112491
PMID: 40703089
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12284053/
Abstract

Spiking neural networks (SNNs) inspired by biological neurons offer energy-efficient and interpretable computation but are limited by the simplistic structure of point neurons. We introduce a multi-compartment spiking neuron model (MCN) with trainable cross-compartment connections that simulate soma-dendrite interactions. Theoretically, we prove that these connections act as spatiotemporal momentum, guiding learning dynamics toward global optima. To leverage this, we propose a multi-compartment spatiotemporal backpropagation (MCST-BP) algorithm that enhances gradient flow stability. Experimental results on multiple benchmark datasets, including S-MNIST, CIFAR-10, Spiking Heidelberg Digits (SHD), and ECG, show that MC-SNNs outperform traditional SNNs in both convergence speed and accuracy. Our work bridges neurobiological structure and computational modeling, providing a theoretical and practical foundation for high-performance brain-inspired learning systems.
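To make the abstract's core idea concrete, here is a minimal sketch of a two-compartment leaky integrate-and-fire neuron in which a coupling weight carries the dendritic potential into the soma, plus the rectangular surrogate derivative that backpropagation-based SNN training typically relies on. All names, time constants, and dynamics here are illustrative assumptions, not the paper's exact MCN or MCST-BP formulation.

```python
def mc_lif_step(v_soma, v_dend, i_in, w_couple,
                tau_s=0.9, tau_d=0.9, v_th=1.0):
    """Advance a hypothetical two-compartment LIF neuron by one timestep.

    The dendritic compartment integrates the input current; the trainable
    cross-compartment weight w_couple forwards its potential to the soma,
    loosely mimicking the soma-dendrite interaction described above.
    """
    v_dend = tau_d * v_dend + i_in               # dendrite integrates input
    v_soma = tau_s * v_soma + w_couple * v_dend  # coupling drives the soma
    spike = 1.0 if v_soma >= v_th else 0.0       # threshold crossing
    v_soma *= 1.0 - spike                        # hard reset after a spike
    return v_soma, v_dend, spike

def surrogate_grad(v_soma, v_th=1.0, width=0.5):
    """Rectangular surrogate for the non-differentiable spike function,
    the standard trick used by spatio-temporal backpropagation variants."""
    return 1.0 if abs(v_soma - v_th) < width else 0.0

# Drive the neuron with a constant input and count output spikes.
v_s = v_d = 0.0
spike_count = 0
for _ in range(50):
    v_s, v_d, s = mc_lif_step(v_s, v_d, i_in=0.3, w_couple=0.5)
    spike_count += int(s)
print(spike_count)
```

In a trained network, `w_couple` would be one of the learnable cross-compartment parameters; the abstract's theoretical claim is that such connections behave like a spatiotemporal momentum term during learning.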


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f23/12284053/efd44ca9598b/fx1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f23/12284053/7a72ccc2d884/gr1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f23/12284053/8ab37bd1844e/gr2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f23/12284053/a16c4aa8bfc9/gr3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f23/12284053/78a22ca21bdf/gr4.jpg

Similar articles

1. Learning-efficient spiking neural networks with multi-compartment spatio-temporal backpropagation.
iScience. 2025 Jun 3;28(7):112491. doi: 10.1016/j.isci.2025.112491. eCollection 2025 Jul 18.
2. Combining aggregated attention and transformer architecture for accurate and efficient performance of Spiking Neural Networks.
Neural Netw. 2025 Jul 3;191:107789. doi: 10.1016/j.neunet.2025.107789.
3. Spiking Neural Networks for Multimodal Neuroimaging: A Comprehensive Review of Current Trends and the NeuCube Brain-Inspired Architecture.
Bioengineering (Basel). 2025 Jun 9;12(6):628. doi: 10.3390/bioengineering12060628.
4. ESTSformer: Efficient spatio-temporal spiking transformer.
Neural Netw. 2025 Nov;191:107786. doi: 10.1016/j.neunet.2025.107786. Epub 2025 Jul 2.
5. The architecture design and training optimization of spiking neural network with low-latency and high-performance for classification and segmentation.
Neural Netw. 2025 Jun 21;191:107790. doi: 10.1016/j.neunet.2025.107790.
6. Short-Term Memory Impairment.
7. Paired competing neurons improving STDP supervised local learning in Spiking Neural Networks.
Front Neurosci. 2024 Jul 24;18:1401690. doi: 10.3389/fnins.2024.1401690. eCollection 2024.
8. Brain-inspired learning rules for spiking neural network-based control: a tutorial.
Biomed Eng Lett. 2024 Dec 2;15(1):37-55. doi: 10.1007/s13534-024-00436-6. eCollection 2025 Jan.
9. Deep predictive coding with bi-directional propagation for classification and reconstruction.
Neural Netw. 2025 Nov;191:107785. doi: 10.1016/j.neunet.2025.107785. Epub 2025 Jul 3.
10. Robust Spatiotemporal Prototype Learning for Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2025 Jul 4;PP. doi: 10.1109/TNNLS.2025.3583747.

References cited in this article

1. Hybrid neural networks for continual learning inspired by corticohippocampal circuits.
Nat Commun. 2025 Feb 2;16(1):1272. doi: 10.1038/s41467-025-56405-9.
2. Highly efficient neuromorphic learning system of spiking neural network with multi-compartment leaky integrate-and-fire neurons.
Front Neurosci. 2022 Sep 28;16:929644. doi: 10.3389/fnins.2022.929644. eCollection 2022.
3. An Ultrahigh Rate and Stable Zinc Anode by Facet-Matching-Induced Dendrite Regulation.
Adv Mater. 2022 Sep;34(37):e2203835. doi: 10.1002/adma.202203835. Epub 2022 Aug 12.
4. SHEL5K: An Extended Dataset and Benchmarking for Safety Helmet Detection.
Sensors (Basel). 2022 Mar 17;22(6):2315. doi: 10.3390/s22062315.
5. Are Dendrites Conceptually Useful?
Neuroscience. 2022 May 1;489:4-14. doi: 10.1016/j.neuroscience.2022.03.008. Epub 2022 Mar 11.
6. Parallel and Recurrent Cascade Models as a Unifying Force for Understanding Subcellular Computation.
Neuroscience. 2022 May 1;489:200-215. doi: 10.1016/j.neuroscience.2021.07.026. Epub 2021 Aug 3.
7. Efficient Spike-Driven Learning With Dendritic Event-Based Processing.
Front Neurosci. 2021 Feb 19;15:601109. doi: 10.3389/fnins.2021.601109. eCollection 2021.
8. The Remarkable Robustness of Surrogate Gradient Learning for Instilling Complex Function in Spiking Neural Networks.
Neural Comput. 2021 Mar 26;33(4):899-925. doi: 10.1162/neco_a_01367.
9. Spatial Properties of STDP in a Self-Learning Spiking Neural Network Enable Controlling a Mobile Robot.
Front Neurosci. 2020 Feb 26;14:88. doi: 10.3389/fnins.2020.00088. eCollection 2020.
10. Dendritic action potentials and computation in human layer 2/3 cortical neurons.
Science. 2020 Jan 3;367(6473):83-87. doi: 10.1126/science.aax6239.