

Robustness to Training Disturbances in SpikeProp Learning.

Author Information

Shrestha Sumit Bam, Song Qing

Publication Information

IEEE Trans Neural Netw Learn Syst. 2018 Jul;29(7):3126-3139. doi: 10.1109/TNNLS.2017.2713125. Epub 2017 Jul 4.

DOI: 10.1109/TNNLS.2017.2713125
PMID: 28692992
Abstract

Stability is a key issue during spiking neural network training using SpikeProp. The inherent nonlinearity of spiking neurons means that the learning manifold changes abruptly; therefore, we need to carefully choose the learning steps at every instance. Other sources of instability are the external disturbances that come along with the training samples, as well as the internal disturbances that arise due to modeling imperfection. The unstable learning scenario can be indirectly observed in the form of surges, which are sudden increases in the learning cost and are a common occurrence during SpikeProp training. Research in the past has shown that a proper learning step size is crucial to minimize surges during the training process. To determine a proper learning step and thereby avoid steep learning manifolds, we perform weight convergence analysis of SpikeProp learning in the presence of disturbance signals. The weight convergence analysis is further extended to a robust stability analysis linked with the overall system error. This ensures boundedness of the total learning error under the minimal assumption of bounded disturbance signals. These analyses result in a learning rate normalization scheme, which is the key result of this paper. The performance of learning using this scheme has been compared with the prevailing methods on different benchmark data sets, and the results show that this method achieves stable learning, reflected by minimal surges during learning, a higher success rate across training instances, and faster learning as well.
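The general idea behind learning rate normalization can be illustrated with a minimal, hypothetical sketch in the spirit of normalized LMS: the effective step size is scaled down by the gradient magnitude so that a single update cannot become arbitrarily large even on steep regions of the learning manifold. This is an illustrative toy example, not the paper's exact SpikeProp scheme; the function name and constants are assumptions.

```python
import numpy as np

def normalized_update(w, grad, mu=0.1, eps=1e-8):
    """One gradient step with the learning rate normalized by the
    squared gradient norm (normalized-LMS style). The eps term keeps
    the step well defined when the gradient is near zero."""
    step = mu / (eps + np.dot(grad, grad))  # bounded effective step size
    return w - step * grad

# Toy usage: minimize f(w) = ||w||^2 / 2, whose gradient is w itself.
w = np.array([4.0, -3.0])
for _ in range(50):
    w = normalized_update(w, w)
```

Because the step is divided by the squared gradient norm, the size of each weight change, `step * ||grad|| = mu * ||grad|| / (eps + ||grad||^2)`, stays bounded regardless of how large the gradient gets, which is the intuition behind suppressing surges.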


Similar Articles

1. Robustness to Training Disturbances in SpikeProp Learning.
IEEE Trans Neural Netw Learn Syst. 2018 Jul;29(7):3126-3139. doi: 10.1109/TNNLS.2017.2713125. Epub 2017 Jul 4.
2. Robust learning in SpikeProp.
Neural Netw. 2017 Feb;86:54-68. doi: 10.1016/j.neunet.2016.10.011. Epub 2016 Nov 8.
3. Adaptive learning rate of SpikeProp based on weight convergence analysis.
Neural Netw. 2015 Mar;63:185-98. doi: 10.1016/j.neunet.2014.12.001. Epub 2014 Dec 10.
4. Spiking Neural Network Regularization With Fixed and Adaptive Drop-Keep Probabilities.
IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):4096-4109. doi: 10.1109/TNNLS.2021.3055825. Epub 2022 Aug 3.
5. On the Analytical Solution of Firing Time for SpikeProp.
Neural Comput. 2016 Nov;28(11):2461-2473. doi: 10.1162/NECO_a_00884. Epub 2016 Aug 24.
6. Robust spike-train learning in spike-event based weight update.
Neural Netw. 2017 Dec;96:33-46. doi: 10.1016/j.neunet.2017.08.010. Epub 2017 Sep 12.
7. The convergence analysis of SpikeProp algorithm with smoothing L regularization.
Neural Netw. 2018 Jul;103:19-28. doi: 10.1016/j.neunet.2018.03.007. Epub 2018 Mar 16.
8. A new supervised learning algorithm for multiple spiking neural networks with application in epilepsy and seizure detection.
Neural Netw. 2009 Dec;22(10):1419-31. doi: 10.1016/j.neunet.2009.04.003. Epub 2009 Apr 22.
9. On-line learning of dynamical systems in the presence of model mismatch and disturbances.
IEEE Trans Neural Netw. 2000;11(6):1272-83. doi: 10.1109/72.883420.
10. Spike-timing error backpropagation in theta neuron networks.
Neural Comput. 2009 Jan;21(1):9-45. doi: 10.1162/neco.2008.09-07-610.

Cited By

1. Rethinking skip connections in Spiking Neural Networks with Time-To-First-Spike coding.
Front Neurosci. 2024 Feb 14;18:1346805. doi: 10.3389/fnins.2024.1346805. eCollection 2024.
2. NeuroCARE: A generic neuromorphic edge computing framework for healthcare applications.
Front Neurosci. 2023 Jan 23;17:1093865. doi: 10.3389/fnins.2023.1093865. eCollection 2023.
3. Supervised Learning Algorithm for Multilayer Spiking Neural Networks with Long-Term Memory Spike Response Model.
Comput Intell Neurosci. 2021 Nov 24;2021:8592824. doi: 10.1155/2021/8592824. eCollection 2021.