Training Spiking Neural Networks for Cognitive Tasks: A Versatile Framework Compatible With Various Temporal Codes.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2020 Apr;31(4):1285-1296. doi: 10.1109/TNNLS.2019.2919662. Epub 2019 Jun 21.

Abstract

Recent studies have demonstrated the effectiveness of supervised learning in spiking neural networks (SNNs). A trainable SNN provides a valuable tool not only for engineering applications but also for theoretical neuroscience studies. Here, we propose a modified SpikeProp learning algorithm, which ensures better learning stability for SNNs and provides more diverse network structures and coding schemes. Specifically, we designed a spike gradient threshold rule to solve the well-known gradient exploding problem in SNN training. In addition, regulation rules on firing rates and connection weights are proposed to control the network activity during training. Based on these rules, biologically realistic features such as lateral connections, complex synaptic dynamics, and sparse activities are included in the network to facilitate neural computation. We demonstrate the versatility of this framework by implementing three well-known temporal codes for different types of cognitive tasks, namely, handwritten digit recognition, spatial coordinate transformation, and motor sequence generation. Several important features observed in experimental studies, such as selective activity, excitatory-inhibitory balance, and weak pairwise correlation, emerged in the trained model. This agreement between experimental and computational results further confirmed the importance of these features in neural function. This work provides a new framework, in which various neural behaviors can be modeled and the underlying computational mechanisms can be studied.
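
The abstract names a "spike gradient threshold rule" for the gradient-exploding problem in SpikeProp-style training, plus regulation rules on firing rates and connection weights. As a rough illustration only, the sketch below models the threshold rule as a cap on the spike-time derivative used in SpikeProp backpropagation and the rate regulation as a simple penalty term; the bound value, the penalty form, and all function names are assumptions made for illustration, not the paper's actual formulation.

```python
import numpy as np

# Illustrative sketch (assumed form, not the paper's exact rule).
# In SpikeProp, error is backpropagated through the output spike time t_f;
# its derivative with respect to the membrane potential u is approximately
#     dt_f/du = -1 / (du/dt evaluated at t_f),
# which blows up when the potential crosses threshold with a shallow slope.
# Capping this term is one plausible reading of the "spike gradient
# threshold rule" named in the abstract.

GRAD_BOUND = 10.0  # assumed cap on |dt_f/du|


def spike_time_gradient(du_dt, bound=GRAD_BOUND):
    """Clipped spike-time derivative -1/(du/dt) used in SpikeProp-style BP."""
    eps = 1e-12
    slope = du_dt if abs(du_dt) > eps else eps  # guard against division by zero
    return float(np.clip(-1.0 / slope, -bound, bound))


def regulated_update(w, grad, rate, target_rate=0.05, lr=1e-3, lam=1e-2):
    """Weight update with a firing-rate penalty, a simple stand-in for the
    abstract's 'regulation rules on firing rates and connection weights'."""
    return w - lr * (grad + lam * (rate - target_rate))


if __name__ == "__main__":
    # A shallow threshold crossing would give a raw gradient of -1000;
    # the cap keeps it at -10, which is the stabilizing effect described.
    print(spike_time_gradient(0.001))            # -> -10.0
    print(regulated_update(0.5, 0.2, rate=0.2))  # small, rate-penalized step
```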

