
Training Spiking Neural Networks for Cognitive Tasks: A Versatile Framework Compatible With Various Temporal Codes.

Publication Info

IEEE Trans Neural Netw Learn Syst. 2020 Apr;31(4):1285-1296. doi: 10.1109/TNNLS.2019.2919662. Epub 2019 Jun 21.

DOI: 10.1109/TNNLS.2019.2919662
PMID: 31247574
Abstract

Recent studies have demonstrated the effectiveness of supervised learning in spiking neural networks (SNNs). A trainable SNN provides a valuable tool not only for engineering applications but also for theoretical neuroscience studies. Here, we propose a modified SpikeProp learning algorithm, which ensures better learning stability for SNNs and provides more diverse network structures and coding schemes. Specifically, we designed a spike gradient threshold rule to solve the well-known gradient exploding problem in SNN training. In addition, regulation rules on firing rates and connection weights are proposed to control the network activity during training. Based on these rules, biologically realistic features such as lateral connections, complex synaptic dynamics, and sparse activities are included in the network to facilitate neural computation. We demonstrate the versatility of this framework by implementing three well-known temporal codes for different types of cognitive tasks, namely, handwritten digit recognition, spatial coordinate transformation, and motor sequence generation. Several important features observed in experimental studies, such as selective activity, excitatory-inhibitory balance, and weak pairwise correlation, emerged in the trained model. This agreement between experimental and computational results further confirmed the importance of these features in neural function. This work provides a new framework, in which various neural behaviors can be modeled and the underlying computational mechanisms can be studied.
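The spike gradient threshold rule mentioned in the abstract can be illustrated with a minimal sketch. In SpikeProp-style learning, the derivative of a spike time with respect to a weight contains a 1/(du/dt) factor evaluated at the firing moment; when the membrane potential crosses threshold slowly, du/dt approaches zero and the gradient explodes. Bounding the resulting gradient keeps updates stable. This is an illustrative approximation, not the authors' implementation: the names `threshold_gradient`, `g_max`, and `eps` are assumptions for this example.

```python
import numpy as np

def threshold_gradient(grad, g_max=1.0):
    # Clip each gradient entry to [-g_max, g_max]: a simple stand-in
    # for a spike-gradient threshold rule that bounds weight updates.
    return np.clip(grad, -g_max, g_max)

def spikeprop_weight_grad(dL_dt_spike, dudw, dudt, eps=1e-12):
    # SpikeProp chain rule at the firing time t*:
    #   dt*/dw = -(du/dw) / (du/dt),
    # so dL/dw = dL/dt* * dt*/dw. A small eps avoids division by zero.
    return dL_dt_spike * (-dudw / (dudt + eps))

# A slow threshold crossing (du/dt ~ 1e-4) makes the raw gradient huge;
# the threshold rule caps it at g_max.
raw = spikeprop_weight_grad(dL_dt_spike=0.5, dudw=2.0, dudt=1e-4)
safe = threshold_gradient(raw, g_max=5.0)
```

Here `raw` is on the order of -10^4, while `safe` is capped at -5.0, so a single near-tangential threshold crossing cannot destabilize training.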


Similar Articles

1
Training Spiking Neural Networks for Cognitive Tasks: A Versatile Framework Compatible With Various Temporal Codes.
IEEE Trans Neural Netw Learn Syst. 2020 Apr;31(4):1285-1296. doi: 10.1109/TNNLS.2019.2919662. Epub 2019 Jun 21.
2
A biologically plausible supervised learning method for spiking neural networks using the symmetric STDP rule.
Neural Netw. 2020 Jan;121:387-395. doi: 10.1016/j.neunet.2019.09.007. Epub 2019 Sep 27.
3
HybridSNN: Combining Bio-Machine Strengths by Boosting Adaptive Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2023 Sep;34(9):5841-5855. doi: 10.1109/TNNLS.2021.3131356. Epub 2023 Sep 1.
4
Competitive Learning in a Spiking Neural Network: Towards an Intelligent Pattern Classifier.
Sensors (Basel). 2020 Jan 16;20(2):500. doi: 10.3390/s20020500.
5
Supervised Learning in Multilayer Spiking Neural Networks With Spike Temporal Error Backpropagation.
IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10141-10153. doi: 10.1109/TNNLS.2022.3164930. Epub 2023 Nov 30.
6
An unsupervised STDP-based spiking neural network inspired by biologically plausible learning rules and connections.
Neural Netw. 2023 Aug;165:799-808. doi: 10.1016/j.neunet.2023.06.019. Epub 2023 Jun 22.
7
Tuning Convolutional Spiking Neural Network With Biologically Plausible Reward Propagation.
IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7621-7631. doi: 10.1109/TNNLS.2021.3085966. Epub 2022 Nov 30.
8
Mirrored STDP Implements Autoencoder Learning in a Network of Spiking Neurons.
PLoS Comput Biol. 2015 Dec 3;11(12):e1004566. doi: 10.1371/journal.pcbi.1004566. eCollection 2015 Dec.
9
Deep learning in spiking neural networks.
Neural Netw. 2019 Mar;111:47-63. doi: 10.1016/j.neunet.2018.12.002. Epub 2018 Dec 18.
10
An Efficient Supervised Training Algorithm for Multilayer Spiking Neural Networks.
PLoS One. 2016 Apr 4;11(4):e0150329. doi: 10.1371/journal.pone.0150329. eCollection 2016.

Cited By

1
A Novel Robotic Controller Using Neural Engineering Framework-Based Spiking Neural Networks.
Sensors (Basel). 2024 Jan 12;24(2):491. doi: 10.3390/s24020491.
2
Learnable axonal delay in spiking neural networks improves spoken word recognition.
Front Neurosci. 2023 Nov 9;17:1275944. doi: 10.3389/fnins.2023.1275944. eCollection 2023.
3
Direct learning-based deep spiking neural networks: a review.
Front Neurosci. 2023 Jun 16;17:1209795. doi: 10.3389/fnins.2023.1209795. eCollection 2023.
4
Supervised Learning Algorithm for Multilayer Spiking Neural Networks with Long-Term Memory Spike Response Model.
Comput Intell Neurosci. 2021 Nov 24;2021:8592824. doi: 10.1155/2021/8592824. eCollection 2021.
5
[A review of brain-like spiking neural network and its neuromorphic chip research].
Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2021 Oct 25;38(5):986-994. doi: 10.7507/1001-5515.202011005.
6
Spiking Autoencoders With Temporal Coding.
Front Neurosci. 2021 Aug 13;15:712667. doi: 10.3389/fnins.2021.712667. eCollection 2021.