Sparse-firing regularization methods for spiking neural networks with time-to-first-spike coding.

Authors

Sakemi Yusuke, Yamamoto Kakei, Hosomi Takeo, Aihara Kazuyuki

Affiliations

Research Center for Mathematical Engineering, Chiba Institute of Technology, Narashino, Japan.

International Research Center for Neurointelligence (WPI-IRCN), The University of Tokyo, Tokyo, Japan.

Publication

Sci Rep. 2023 Dec 21;13(1):22897. doi: 10.1038/s41598-023-50201-5.

DOI: 10.1038/s41598-023-50201-5
PMID: 38129555
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10739753/
Abstract

The training of multilayer spiking neural networks (SNNs) using the error backpropagation algorithm has made significant progress in recent years. Among the various training schemes, the error backpropagation method that directly uses the firing time of neurons has attracted considerable attention because it can realize ideal temporal coding. This method uses time-to-first-spike (TTFS) coding, in which each neuron fires at most once, and this restriction on the number of firings enables information to be processed at a very low firing frequency. This low firing frequency increases the energy efficiency of information processing in SNNs. However, only an upper limit has been provided for TTFS-coded SNNs, and the information-processing capability of SNNs at lower firing frequencies has not been fully investigated. In this paper, we propose two spike-timing-based sparse-firing (SSR) regularization methods to further reduce the firing frequency of TTFS-coded SNNs. Both methods are characterized by the fact that they only require information about the firing timing and associated weights. The effects of these regularization methods were investigated on the MNIST, Fashion-MNIST, and CIFAR-10 datasets using multilayer perceptron networks and convolutional neural network structures.

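The abstract describes TTFS coding (each neuron fires at most once, with information carried by firing time) and regularizers that need only firing times and associated weights. The paper's actual SSR formulations are not given in the abstract; the sketch below is only a minimal NumPy illustration of the two ingredients: encoding intensities as first-spike times, and a generic timing-based sparsity penalty that pushes spikes later. The names `ttfs_encode`, `timing_sparsity_penalty`, and the parameter `t_max` are hypothetical, not taken from the paper.

```python
import numpy as np

def ttfs_encode(x, t_max=10.0):
    """Time-to-first-spike encoding: map intensities in [0, 1] to
    first-spike times, with stronger inputs firing earlier.
    Each input neuron fires at most once."""
    x = np.clip(x, 0.0, 1.0)
    return t_max * (1.0 - x)  # intensity 1 -> t = 0, intensity 0 -> t = t_max

def timing_sparsity_penalty(spike_times, t_max=10.0):
    """Hypothetical sparse-firing penalty built only from spike
    timings: early spikes are penalized, so minimizing it delays
    (and in effect suppresses) firing. Not the paper's SSR method."""
    return np.mean(np.maximum(0.0, t_max - spike_times) / t_max)

x = np.array([0.0, 0.25, 0.5, 1.0])
t = ttfs_encode(x)                     # -> [10. , 7.5, 5. , 0. ]
loss_reg = timing_sparsity_penalty(t)  # -> 0.4375
```

In training, a term like `loss_reg` would be added to the task loss with a weighting coefficient; the abstract indicates the actual SSR variants also involve the synaptic weights, which this toy penalty omits.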

Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c036/10739753/bfac68fcfbd3/41598_2023_50201_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c036/10739753/07fb48d870a2/41598_2023_50201_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c036/10739753/e103b94dfa82/41598_2023_50201_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c036/10739753/d7e62f1a29ec/41598_2023_50201_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c036/10739753/2d62a800d021/41598_2023_50201_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c036/10739753/777734b9d564/41598_2023_50201_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c036/10739753/87126135a158/41598_2023_50201_Fig7_HTML.jpg

Similar articles

1. Sparse-firing regularization methods for spiking neural networks with time-to-first-spike coding.
Sci Rep. 2023 Dec 21;13(1):22897. doi: 10.1038/s41598-023-50201-5.
2. A TTFS-based energy and utilization efficient neuromorphic CNN accelerator.
Front Neurosci. 2023 May 5;17:1121592. doi: 10.3389/fnins.2023.1121592. eCollection 2023.
3. Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.
Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.
4. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
5. Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
6. First-spike coding promotes accurate and efficient spiking neural networks for discrete events with rich temporal structures.
Front Neurosci. 2023 Oct 2;17:1266003. doi: 10.3389/fnins.2023.1266003. eCollection 2023.
7. Rethinking skip connections in Spiking Neural Networks with Time-To-First-Spike coding.
Front Neurosci. 2024 Feb 14;18:1346805. doi: 10.3389/fnins.2024.1346805. eCollection 2024.
8. Temporal Backpropagation for Spiking Neural Networks with One Spike per Neuron.
Int J Neural Syst. 2020 Jun;30(6):2050027. doi: 10.1142/S0129065720500276. Epub 2020 May 28.
9. SPIDE: A purely spike-based method for training feedback spiking neural networks.
Neural Netw. 2023 Apr;161:9-24. doi: 10.1016/j.neunet.2023.01.026. Epub 2023 Jan 24.
10. Training Deep Spiking Neural Networks Using Backpropagation.
Front Neurosci. 2016 Nov 8;10:508. doi: 10.3389/fnins.2016.00508. eCollection 2016.

Cited by

1. An artificial visual neuron with multiplexed rate and time-to-first-spike coding.
Nat Commun. 2024 May 1;15(1):3689. doi: 10.1038/s41467-024-48103-9.

References

1. Backpropagation-Based Learning Techniques for Deep Spiking Neural Networks: A Survey.
IEEE Trans Neural Netw Learn Syst. 2024 Sep;35(9):11906-11921. doi: 10.1109/TNNLS.2023.3263008. Epub 2024 Sep 3.
2. Analyzing time-to-first-spike coding schemes: A theoretical approach.
Front Neurosci. 2022 Sep 26;16:971937. doi: 10.3389/fnins.2022.971937. eCollection 2022.
3. Backpropagation With Sparsity Regularization for Spiking Neural Network Learning.
Front Neurosci. 2022 Apr 14;16:760298. doi: 10.3389/fnins.2022.760298. eCollection 2022.
4. A Synaptic Pruning-Based Spiking Neural Network for Hand-Written Digits Classification.
Front Artif Intell. 2022 Feb 24;5:680165. doi: 10.3389/frai.2022.680165. eCollection 2022.
5. Surrogate gradients for analog neuromorphic computing.
Proc Natl Acad Sci U S A. 2022 Jan 25;119(4). doi: 10.1073/pnas.2109194119.
6. Revisiting Batch Normalization for Training Low-Latency Deep Spiking Neural Networks From Scratch.
Front Neurosci. 2021 Dec 9;15:773954. doi: 10.3389/fnins.2021.773954. eCollection 2021.
7. Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1947-1958. doi: 10.1109/TNNLS.2021.3110991. Epub 2022 May 2.
8. A Supervised Learning Algorithm for Multilayer Spiking Neural Networks Based on Temporal Coding Toward Energy-Efficient VLSI Processor Design.
IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):394-408. doi: 10.1109/TNNLS.2021.3095068. Epub 2023 Jan 5.
9. Event-based backpropagation can compute exact gradients for spiking neural networks.
Sci Rep. 2021 Jun 18;11(1):12829. doi: 10.1038/s41598-021-91786-z.
10. Temporal Coding in Spiking Neural Networks With Alpha Synaptic Function: Learning With Backpropagation.
IEEE Trans Neural Netw Learn Syst. 2022 Oct;33(10):5939-5952. doi: 10.1109/TNNLS.2021.3071976. Epub 2022 Oct 5.