

Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.

Authors

Lee Chankyu, Sarwar Syed Shakib, Panda Priyadarshini, Srinivasan Gopalakrishnan, Roy Kaushik

Affiliations

Nanoelectronics Research Laboratory, School of Electrical and Computer Engineering, Purdue University, West Lafayette, IN, United States.

Publication

Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.

DOI: 10.3389/fnins.2020.00119
PMID: 32180697
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7059737/
Abstract

Spiking Neural Networks (SNNs) have recently emerged as a prominent neural computing paradigm. However, the typical shallow SNN architectures have limited capacity for expressing complex representations, while training deep SNNs using input spikes has not been successful so far. Diverse methods have been proposed to get around this issue, such as converting off-the-shelf trained deep Artificial Neural Networks (ANNs) to SNNs. However, the ANN-SNN conversion scheme fails to capture the temporal dynamics of a spiking system. On the other hand, it is still a difficult problem to directly train deep SNNs using input spike events due to the discontinuous, non-differentiable nature of the spike generation function. To overcome this problem, we propose an approximate derivative method that accounts for the leaky behavior of Leaky Integrate-and-Fire (LIF) neurons. This method enables training deep convolutional SNNs directly (with input spike events) using spike-based backpropagation. Our experiments show the effectiveness of the proposed spike-based learning on deep networks (VGG and Residual architectures) by achieving the best classification accuracies on the MNIST, SVHN, and CIFAR-10 datasets compared to other SNNs trained with spike-based learning. Moreover, we analyze sparse event-based computations to demonstrate the efficacy of the proposed SNN training method for inference operation in the spiking domain.
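The method summarized above replaces the non-differentiable spike generation step with an approximate derivative so that standard backpropagation can flow through leaky integrate-and-fire dynamics. Below is a minimal sketch of that idea, not the authors' released implementation: an illustrative triangular pseudo-derivative stands in for the paper's leak-aware approximation, and the leak/threshold constants are arbitrary placeholder values.

```python
import torch

class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass; approximate derivative in the backward pass."""
    @staticmethod
    def forward(ctx, v_shifted):
        ctx.save_for_backward(v_shifted)
        return (v_shifted > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_shifted,) = ctx.saved_tensors
        # Triangular surrogate around the firing threshold (illustrative choice);
        # the paper's approximation additionally accounts for the membrane leak.
        surrogate = torch.clamp(1.0 - v_shifted.abs(), min=0.0)
        return grad_output * surrogate

def lif_step(v, x, leak=0.95, thresh=1.0):
    """One discrete-time LIF update: leak, integrate input current, fire, reset."""
    v = leak * v + x                   # leaky integration of input current
    spike = SpikeFn.apply(v - thresh)  # binary firing; gradient uses the surrogate
    v = v - spike * thresh             # reset by subtraction keeps residual charge
    return v, spike

# Unroll over T timesteps and backpropagate through the spike train.
T, batch, features = 20, 4, 8
x_seq = torch.rand(T, batch, features, requires_grad=True)
v = torch.zeros(batch, features)
spikes = []
for t in range(T):
    v, s = lif_step(v, x_seq[t])
    spikes.append(s)
loss = torch.stack(spikes).mean()
loss.backward()  # gradients reach x_seq via the approximate derivative
```

The key point is the split: the forward pass emits genuine binary spike events, while the backward pass substitutes a smooth pseudo-derivative at the discontinuity, which is what makes direct spike-based training of deep architectures tractable.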


Similar Articles

1. Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.
Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.
2. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
3. Exploring Optimized Spiking Neural Network Architectures for Classification Tasks on Embedded Platforms.
Sensors (Basel). 2021 May 7;21(9):3240. doi: 10.3390/s21093240.
4. Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
5. Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
6. Training Deep Spiking Neural Networks Using Backpropagation.
Front Neurosci. 2016 Nov 8;10:508. doi: 10.3389/fnins.2016.00508. eCollection 2016.
7. High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron.
Front Neurosci. 2023 Mar 8;17:1141701. doi: 10.3389/fnins.2023.1141701. eCollection 2023.
8. A Tandem Learning Rule for Effective Training and Rapid Inference of Deep Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):446-460. doi: 10.1109/TNNLS.2021.3095724. Epub 2023 Jan 5.
9. DIET-SNN: A Low-Latency Spiking Neural Network With Direct Input Encoding and Leakage and Threshold Optimization.
IEEE Trans Neural Netw Learn Syst. 2023 Jun;34(6):3174-3182. doi: 10.1109/TNNLS.2021.3111897. Epub 2023 Jun 1.
10. Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning.
Front Neurosci. 2018 Aug 3;12:435. doi: 10.3389/fnins.2018.00435. eCollection 2018.

Cited By

1. Neuromorphic computing for robotic vision: algorithms to hardware advances.
Commun Eng. 2025 Aug 13;4(1):152. doi: 10.1038/s44172-025-00492-5.
2. SpyKing-Privacy-preserving framework for Spiking Neural Networks.
Front Neurosci. 2025 May 30;19:1551143. doi: 10.3389/fnins.2025.1551143. eCollection 2025.
3. Engineered biological neuronal networks as basic logic operators.
Front Comput Neurosci. 2025 Apr 28;19:1559936. doi: 10.3389/fncom.2025.1559936. eCollection 2025.
4. Encrypted Spiking Neural Networks Based on Adaptive Differential Privacy Mechanism.
Entropy (Basel). 2025 Mar 22;27(4):333. doi: 10.3390/e27040333.
5. An accurate and fast learning approach in the biologically spiking neural network.
Sci Rep. 2025 Feb 24;15(1):6585. doi: 10.1038/s41598-025-90113-0.
6. Fourier or Wavelet bases as counterpart self-attention in spikformer for efficient visual classification.
Front Neurosci. 2025 Jan 29;18:1516868. doi: 10.3389/fnins.2024.1516868. eCollection 2024.
7. Memristive leaky integrate-and-fire neuron and learnable straight-through estimator in spiking neural networks.
Cogn Neurodyn. 2024 Oct;18(5):3075-3091. doi: 10.1007/s11571-024-10133-w. Epub 2024 Jun 20.
8. Brain-Inspired Architecture for Spiking Neural Networks.
Biomimetics (Basel). 2024 Oct 21;9(10):646. doi: 10.3390/biomimetics9100646.
9. Direct training high-performance deep spiking neural networks: a review of theories and methods.
Front Neurosci. 2024 Jul 31;18:1383844. doi: 10.3389/fnins.2024.1383844. eCollection 2024.
10. Auto-Spikformer: Spikformer architecture search.
Front Neurosci. 2024 Jul 23;18:1372257. doi: 10.3389/fnins.2024.1372257. eCollection 2024.

References

1. A solution to the learning dilemma for recurrent networks of spiking neurons.
Nat Commun. 2020 Jul 17;11(1):3625. doi: 10.1038/s41467-020-17236-y.
2. Going Deeper in Spiking Neural Networks: VGG and Residual Architectures.
Front Neurosci. 2019 Mar 7;13:95. doi: 10.3389/fnins.2019.00095. eCollection 2019.
3. Backpropagation through time and the brain.
Curr Opin Neurobiol. 2019 Apr;55:82-89. doi: 10.1016/j.conb.2019.01.011. Epub 2019 Mar 7.
4. SpiLinC: Spiking Liquid-Ensemble Computing for Unsupervised Speech and Image Recognition.
Front Neurosci. 2018 Aug 23;12:524. doi: 10.3389/fnins.2018.00524. eCollection 2018.
5. Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning.
Front Neurosci. 2018 Aug 3;12:435. doi: 10.3389/fnins.2018.00435. eCollection 2018.
6. Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
7. A Dynamic Connectome Supports the Emergence of Stable Computational Function of Neural Circuits through Reward-Based Learning.
eNeuro. 2018 Apr 24;5(2). doi: 10.1523/ENEURO.0301-17.2018. eCollection 2018 Mar-Apr.
8. Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification.
Front Neurosci. 2017 Dec 7;11:682. doi: 10.3389/fnins.2017.00682. eCollection 2017.
9. STDP-based spiking deep convolutional neural networks for object recognition.
Neural Netw. 2018 Mar;99:56-67. doi: 10.1016/j.neunet.2017.12.005. Epub 2017 Dec 23.
10. Supervised Learning Based on Temporal Coding in Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2018 Jul;29(7):3227-3235. doi: 10.1109/TNNLS.2017.2726060. Epub 2017 Aug 1.