

Exploiting nonlinear dendritic adaptive computation in training deep Spiking Neural Networks.

Affiliations

Brain-Inspired Cognitive Intelligence Lab, Institute of Automation, Chinese Academy of Sciences (CAS), Beijing, China; School of Future Technology, University of Chinese Academy of Sciences, Beijing, China.

Brain-Inspired Cognitive Intelligence Lab, Institute of Automation, Chinese Academy of Sciences (CAS), Beijing, China.

Publication Information

Neural Netw. 2024 Feb;170:190-201. doi: 10.1016/j.neunet.2023.10.056. Epub 2023 Nov 10.

DOI: 10.1016/j.neunet.2023.10.056
PMID: 37989040
Abstract

Inspired by the information transmission process in the brain, Spiking Neural Networks (SNNs) have gained considerable attention due to their event-driven nature. However, as the network structure grows complex, managing the spiking behavior within the network becomes challenging. Networks with excessively dense or sparse spikes fail to transmit sufficient information, inhibiting SNNs from exhibiting superior performance. Current SNNs linearly sum presynaptic information in postsynaptic neurons, overlooking the adaptive adjustment effect of dendrites on information processing. In this study, we introduce the Dendritic Spatial Gating Module (DSGM), which scales and translates the input, reducing the loss incurred when transforming the continuous membrane potential into discrete spikes. Simultaneously, by implementing the Dendritic Temporal Adjust Module (DTAM), dendrites assign different importance to inputs of different time steps, facilitating the establishment of the temporal dependency of spiking neurons and effectively integrating multi-step time information. The fusion of these two modules results in a more balanced spike representation within the network, significantly enhancing the neural network's performance. This approach has achieved state-of-the-art performance on static image datasets, including CIFAR10 and CIFAR100, as well as event datasets like DVS-CIFAR10, DVS-Gesture, and N-Caltech101. It also demonstrates competitive performance compared to the current state-of-the-art on the ImageNet dataset.
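The abstract describes two mechanisms layered onto a standard leaky integrate-and-fire (LIF) neuron: a spatial module (DSGM) that scales and shifts the input before the membrane update, and a temporal module (DTAM) that weights the contribution of each time step. The paper's exact formulations are not given here, so the following is only an illustrative sketch of that structure; the function name `lif_forward` and the parameters `scale`, `shift`, `time_weights`, `tau`, and `v_th` are assumptions, not the authors' API.

```python
import numpy as np

def lif_forward(inputs, scale, shift, time_weights, v_th=1.0, tau=0.5):
    """Sketch of a LIF neuron with DSGM/DTAM-style modulation.

    inputs:       array of shape (T, N) -- presynaptic current per time step.
    scale, shift: per-feature affine parameters (DSGM-like spatial gating)
                  applied to the input before integration.
    time_weights: length-T weights (DTAM-like temporal adjustment) assigning
                  different importance to each time step.
    """
    T, N = inputs.shape
    v = np.zeros(N)                # membrane potential
    spikes = np.zeros((T, N))
    for t in range(T):
        x = scale * inputs[t] + shift           # spatial gating of the input
        v = tau * v + time_weights[t] * x       # leaky integration, step-weighted
        spikes[t] = (v >= v_th).astype(float)   # fire where threshold is crossed
        v = v * (1.0 - spikes[t])               # hard reset of fired neurons
    return spikes
```

In a trained network both the affine parameters and the per-step weights would be learned, letting dendrites rebalance spike density; here they are plain arrays for illustration.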

Similar Articles

1. Exploiting nonlinear dendritic adaptive computation in training deep Spiking Neural Networks.
   Neural Netw. 2024 Feb;170:190-201. doi: 10.1016/j.neunet.2023.10.056. Epub 2023 Nov 10.
2. Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
   Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
3. SGLFormer: Spiking Global-Local-Fusion Transformer with high performance.
   Front Neurosci. 2024 Mar 12;18:1371290. doi: 10.3389/fnins.2024.1371290. eCollection 2024.
4. BackEISNN: A deep spiking neural network with adaptive self-feedback and balanced excitatory-inhibitory neurons.
   Neural Netw. 2022 Oct;154:68-77. doi: 10.1016/j.neunet.2022.06.036. Epub 2022 Jul 11.
5. Sparser spiking activity can be better: Feature Refine-and-Mask spiking neural network for event-based visual recognition.
   Neural Netw. 2023 Sep;166:410-423. doi: 10.1016/j.neunet.2023.07.008. Epub 2023 Jul 20.
6. An unsupervised STDP-based spiking neural network inspired by biologically plausible learning rules and connections.
   Neural Netw. 2023 Aug;165:799-808. doi: 10.1016/j.neunet.2023.06.019. Epub 2023 Jun 22.
7. Brain-inspired neural circuit evolution for spiking neural networks.
   Proc Natl Acad Sci U S A. 2023 Sep 26;120(39):e2218173120. doi: 10.1073/pnas.2218173120. Epub 2023 Sep 20.
8. SPIDE: A purely spike-based method for training feedback spiking neural networks.
   Neural Netw. 2023 Apr;161:9-24. doi: 10.1016/j.neunet.2023.01.026. Epub 2023 Jan 24.
9. An exact mapping from ReLU networks to spiking neural networks.
   Neural Netw. 2023 Nov;168:74-88. doi: 10.1016/j.neunet.2023.09.011. Epub 2023 Sep 11.
10. Sharing leaky-integrate-and-fire neurons for memory-efficient spiking neural networks.
    Front Neurosci. 2023 Jul 31;17:1230002. doi: 10.3389/fnins.2023.1230002. eCollection 2023.