

Efficient training of spiking neural networks with temporally-truncated local backpropagation through time.

Authors

Guo Wenzhe, Fouda Mohammed E, Eltawil Ahmed M, Salama Khaled Nabil

Affiliations

Sensors Lab, Advanced Membranes and Porous Materials Center (AMPMC), Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division, King Abdullah University of Science and Technology, Thuwal, Saudi Arabia.

Communication and Computing Systems Lab, Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division, King Abdullah University of Science and Technology, Thuwal, Saudi Arabia.

Publication

Front Neurosci. 2023 Apr 6;17:1047008. doi: 10.3389/fnins.2023.1047008. eCollection 2023.

DOI: 10.3389/fnins.2023.1047008
PMID: 37090791
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10117667/
Abstract

Directly training spiking neural networks (SNNs) has remained challenging due to complex neural dynamics and the intrinsic non-differentiability of firing functions. The well-known backpropagation through time (BPTT) algorithm proposed to train SNNs suffers from a large memory footprint and prohibits backward and update unlocking, making it impossible to exploit the potential of locally-supervised training methods. This work proposes an efficient direct training algorithm for SNNs that integrates a locally-supervised training method with a temporally-truncated BPTT algorithm. The proposed algorithm exploits both temporal and spatial locality in BPTT and yields a significant reduction in computational cost, including GPU memory utilization, main memory access, and arithmetic operations. We thoroughly explore the design space of temporal truncation length and local training block size and benchmark their impact on the classification accuracy of different networks running different types of tasks. The results reveal that temporal truncation reduces accuracy when classifying frame-based datasets but improves accuracy on event-based datasets. In spite of the resulting information loss, local training can alleviate overfitting. The combined effect of temporal truncation and local training can slow the accuracy drop and even improve accuracy. In addition, training a deep SNN model such as AlexNet on the CIFAR10-DVS dataset yields a 7.26% increase in accuracy, an 89.94% reduction in GPU memory, a 10.79% reduction in memory access, and a 99.64% reduction in MAC operations compared with standard end-to-end BPTT. The proposed method thus shows high potential to enable fast and energy-efficient on-chip training for real-time learning at the edge.
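The truncation idea in the abstract — detaching the temporal credit-assignment chain every K time-steps so that gradients (and the states that must be stored for them) span at most K steps — can be sketched for a single leaky integrate-and-fire (LIF) neuron. This is a hypothetical pure-Python illustration, not the paper's implementation: the decay constant, the rectangular surrogate gradient, and the spike-count loss are illustrative assumptions.

```python
# Temporally-truncated BPTT for one leaky integrate-and-fire (LIF) neuron.
# Loss L = total spike count; the gradient dL/dw is accumulated with the
# temporal dependency chain detached every K time-steps.

def truncated_bptt_grad(x, w, K, lam=0.5, theta=1.0):
    """Return (dL/dw, spike_count) for input sequence x and weight w,
    with BPTT truncated to segments of length K (K = len(x) is full BPTT)."""
    u = 0.0        # membrane potential
    du_dw = 0.0    # running derivative du[t]/dw carried through time
    grad = 0.0
    spikes = 0
    for t, xt in enumerate(x):
        if t % K == 0:
            du_dw = 0.0                    # truncation: detach temporal credit
        u = lam * u + w * xt               # leaky integration
        du_dw = lam * du_dw + xt
        s = 1.0 if u >= theta else 0.0     # non-differentiable firing function
        surr = 1.0 if abs(u - theta) < 0.5 else 0.0  # rectangular surrogate
        grad += surr * du_dw               # chain rule: dL/ds * ds/du * du/dw
        spikes += int(s)
        u *= (1.0 - s)                     # hard reset after a spike
        du_dw *= (1.0 - s)                 # reset the derivative path as well

    return grad, spikes

x = [0.6] * 6                                    # constant input current
g_full, n = truncated_bptt_grad(x, w=1.0, K=6)   # full BPTT over all 6 steps
g_trunc, _ = truncated_bptt_grad(x, w=1.0, K=2)  # truncated, segments of 2
```

Truncation shortens the chain of du/dw terms, so the truncated gradient is a biased but much cheaper estimate of the full-BPTT gradient; in exchange, only the most recent K states need to be kept in memory, which is the source of the GPU-memory savings the abstract reports.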


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/522c/10117667/e267dc90c3f9/fnins-17-1047008-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/522c/10117667/45f62519c757/fnins-17-1047008-g0003.jpg

Similar articles

1
Efficient training of spiking neural networks with temporally-truncated local backpropagation through time.
Front Neurosci. 2023 Apr 6;17:1047008. doi: 10.3389/fnins.2023.1047008. eCollection 2023.
2
Comparing SNNs and RNNs on neuromorphic vision datasets: Similarities and differences.
Neural Netw. 2020 Dec;132:108-120. doi: 10.1016/j.neunet.2020.08.001. Epub 2020 Aug 17.
3
SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
4
Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
5
EXODUS: Stable and efficient training of spiking neural networks.
Front Neurosci. 2023 Feb 8;17:1110444. doi: 10.3389/fnins.2023.1110444. eCollection 2023.
6
Spike-Train Level Direct Feedback Alignment: Sidestepping Backpropagation for On-Chip Training of Spiking Neural Nets.
Front Neurosci. 2020 Mar 13;14:143. doi: 10.3389/fnins.2020.00143. eCollection 2020.
7
Braille letter reading: A benchmark for spatio-temporal pattern recognition on neuromorphic hardware.
Front Neurosci. 2022 Nov 11;16:951164. doi: 10.3389/fnins.2022.951164. eCollection 2022.
8
Deep Learning With Spiking Neurons: Opportunities and Challenges.
Front Neurosci. 2018 Oct 25;12:774. doi: 10.3389/fnins.2018.00774. eCollection 2018.
9
Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
10
On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices.
Front Neurosci. 2020 Jul 7;14:423. doi: 10.3389/fnins.2020.00423. eCollection 2020.

Cited by

1
ALBSNN: ultra-low latency adaptive local binary spiking neural network with accuracy loss estimator.
Front Neurosci. 2023 Sep 13;17:1225871. doi: 10.3389/fnins.2023.1225871. eCollection 2023.

References

1
Training much deeper spiking neural networks with a small number of time-steps.
Neural Netw. 2022 Sep;153:254-268. doi: 10.1016/j.neunet.2022.06.001. Epub 2022 Jun 15.
2
ES-ImageNet: A Million Event-Stream Classification Dataset for Spiking Neural Networks.
Front Neurosci. 2021 Nov 25;15:726582. doi: 10.3389/fnins.2021.726582. eCollection 2021.
3
SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
4
Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
5
A Tandem Learning Rule for Effective Training and Rapid Inference of Deep Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):446-460. doi: 10.1109/TNNLS.2021.3095724. Epub 2023 Jan 5.
6
A Survey of Convolutional Neural Networks: Analysis, Applications, and Prospects.
IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):6999-7019. doi: 10.1109/TNNLS.2021.3084827. Epub 2022 Nov 30.
7
Toward the Optimal Design and FPGA Implementation of Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3988-4002. doi: 10.1109/TNNLS.2021.3055421. Epub 2022 Aug 3.
8
Temporal Encoding and Multispike Learning Framework for Efficient Recognition of Visual Patterns.
IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3387-3399. doi: 10.1109/TNNLS.2021.3052804. Epub 2022 Aug 3.
9
Comparing SNNs and RNNs on neuromorphic vision datasets: Similarities and differences.
Neural Netw. 2020 Dec;132:108-120. doi: 10.1016/j.neunet.2020.08.001. Epub 2020 Aug 17.
10
Synaptic Plasticity Dynamics for Deep Continuous Local Learning (DECOLLE).
Front Neurosci. 2020 May 12;14:424. doi: 10.3389/fnins.2020.00424. eCollection 2020.