Efficient Training of Supervised Spiking Neural Network via Accurate Synaptic-Efficiency Adjustment Method.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2017 Jun;28(6):1411-1424. doi: 10.1109/TNNLS.2016.2541339. Epub 2016 Mar 30.

DOI: 10.1109/TNNLS.2016.2541339
PMID: 28113824
Abstract

The spiking neural network (SNN) is the third generation of neural networks and performs remarkably well in cognitive tasks such as pattern recognition. The temporal neural encoding mechanism found in the biological hippocampus gives SNNs more powerful computation capability than networks using other encoding schemes. However, this temporal encoding approach requires neurons to process information serially in time, which significantly reduces learning efficiency. To retain the powerful computation capability of the temporal encoding mechanism while overcoming its low training efficiency, this paper proposes a new training algorithm: the accurate synaptic-efficiency adjustment method. Inspired by the selective attention mechanism of the primate visual system, the algorithm selects only the target spike times as attention areas and ignores the voltage states at non-target times, resulting in a significant reduction of training time. In addition, the algorithm employs a cost function based on the voltage difference between the output neuron's membrane potential and the SNN's firing threshold, instead of the traditional precise firing-time distance. A normalized spike-timing-dependent-plasticity (STDP) learning window is applied to assign this error to individual synapses to guide their training. Comprehensive simulations are conducted to investigate the learning properties of the algorithm, with input neurons emitting both single and multiple spikes. Simulation results indicate that the algorithm achieves higher learning performance than existing methods and state-of-the-art efficiency in SNN training.

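The credit-assignment idea described in the abstract — compute a single error from the gap between the output neuron's membrane potential and the firing threshold, then distribute it across synapses via a normalized STDP window — can be sketched as follows. This is a minimal illustration, not the authors' exact formulation; the exponential kernel, time constant, and learning rate are assumptions chosen for clarity.

```python
import numpy as np

TAU = 10.0  # STDP time constant in ms (illustrative value)

def stdp_window(dt):
    """Causal exponential STDP kernel: weight given to a presynaptic
    spike that fired dt ms before the target output-spike time."""
    return np.exp(-dt / TAU) if dt >= 0 else 0.0

def weight_updates(pre_spike_times, target_time,
                   v_at_target, v_threshold, lr=0.01):
    """Distribute the voltage-difference error (threshold minus membrane
    potential at the target spike time) across synapses using a
    normalized STDP window, so the total update equals lr * error."""
    error = v_threshold - v_at_target  # positive: neuron fired too weakly
    w = np.array([stdp_window(target_time - t) for t in pre_spike_times])
    total = w.sum()
    if total == 0.0:
        return np.zeros_like(w)
    return lr * error * (w / total)    # normalized credit assignment

# Three presynaptic spikes before a target output spike at t = 10 ms;
# the membrane potential (0.8) fell short of the threshold (1.0).
updates = weight_updates([2.0, 5.0, 8.0], target_time=10.0,
                         v_at_target=0.8, v_threshold=1.0)
```

Because the window is normalized, the summed update is exactly `lr * error`, and synapses whose spikes arrived closer to the target time receive a larger share of the correction.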

Similar Articles

1. Efficient Training of Supervised Spiking Neural Network via Accurate Synaptic-Efficiency Adjustment Method.
IEEE Trans Neural Netw Learn Syst. 2017 Jun;28(6):1411-1424. doi: 10.1109/TNNLS.2016.2541339. Epub 2016 Mar 30.
2. An Efficient Supervised Training Algorithm for Multilayer Spiking Neural Networks.
PLoS One. 2016 Apr 4;11(4):e0150329. doi: 10.1371/journal.pone.0150329. eCollection 2016.
3. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
4. Competitive Learning in a Spiking Neural Network: Towards an Intelligent Pattern Classifier.
Sensors (Basel). 2020 Jan 16;20(2):500. doi: 10.3390/s20020500.
5. A Supervised Learning Algorithm for Learning Precise Timing of Multiple Spikes in Multilayer Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2018 Nov;29(11):5394-5407. doi: 10.1109/TNNLS.2018.2797801. Epub 2018 Mar 1.
6. First Error-Based Supervised Learning Algorithm for Spiking Neural Networks.
Front Neurosci. 2019 Jun 6;13:559. doi: 10.3389/fnins.2019.00559. eCollection 2019.
7. A supervised multi-spike learning algorithm based on gradient descent for spiking neural networks.
Neural Netw. 2013 Jul;43:99-113. doi: 10.1016/j.neunet.2013.02.003. Epub 2013 Feb 16.
8. A biologically plausible supervised learning method for spiking neural networks using the symmetric STDP rule.
Neural Netw. 2020 Jan;121:387-395. doi: 10.1016/j.neunet.2019.09.007. Epub 2019 Sep 27.
9. A Highly Effective and Robust Membrane Potential-Driven Supervised Learning Method for Spiking Neurons.
IEEE Trans Neural Netw Learn Syst. 2019 Jan;30(1):123-137. doi: 10.1109/TNNLS.2018.2833077. Epub 2018 May 28.
10. Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition.
Neural Netw. 2013 May;41:188-201. doi: 10.1016/j.neunet.2012.11.014. Epub 2012 Dec 20.

Cited By

1. Forced Oscillation Detection via a Hybrid Network of a Spiking Recurrent Neural Network and LSTM.
Sensors (Basel). 2025 Apr 20;25(8):2607. doi: 10.3390/s25082607.
2. Brain-Inspired Architecture for Spiking Neural Networks.
Biomimetics (Basel). 2024 Oct 21;9(10):646. doi: 10.3390/biomimetics9100646.
3. STSC-SNN: Spatio-Temporal Synaptic Connection with temporal convolution and attention for spiking neural networks.
Front Neurosci. 2022 Dec 23;16:1079357. doi: 10.3389/fnins.2022.1079357. eCollection 2022.
4. ALSA: Associative Learning Based Supervised Learning Algorithm for SNN.
Front Neurosci. 2022 Mar 31;16:838832. doi: 10.3389/fnins.2022.838832. eCollection 2022.
5. Unsupervised speech recognition through spike-timing-dependent plasticity in a convolutional spiking neural network.
PLoS One. 2018 Nov 29;13(11):e0204596. doi: 10.1371/journal.pone.0204596. eCollection 2018.