

Rethinking skip connections in Spiking Neural Networks with Time-To-First-Spike coding.

Authors

Kim Youngeun, Kahana Adar, Yin Ruokai, Li Yuhang, Stinis Panos, Karniadakis George Em, Panda Priyadarshini

Affiliations

Department of Electrical Engineering, Yale University, New Haven, CT, United States.

Division of Applied Mathematics, Brown University, Providence, RI, United States.

Publication

Front Neurosci. 2024 Feb 14;18:1346805. doi: 10.3389/fnins.2024.1346805. eCollection 2024.

DOI: 10.3389/fnins.2024.1346805
PMID: 38419664
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10899405/
Abstract

Time-To-First-Spike (TTFS) coding in Spiking Neural Networks (SNNs) offers significant advantages in terms of energy efficiency, closely mimicking the behavior of biological neurons. In this work, we delve into the role of skip connections, a widely used concept in Artificial Neural Networks (ANNs), within the domain of SNNs with TTFS coding. Our focus is on two distinct types of skip connection architectures: (1) addition-based skip connections, and (2) concatenation-based skip connections. We find that addition-based skip connections introduce an additional delay in terms of spike timing. On the other hand, concatenation-based skip connections circumvent this delay but produce time gaps between after-convolution and skip connection paths, thereby restricting the effective mixing of information from these two paths. To mitigate these issues, we propose a novel approach involving a learnable delay for skip connections in the concatenation-based skip connection architecture. This approach successfully bridges the time gap between the convolutional and skip branches, facilitating improved information mixing. We conduct experiments on public datasets including MNIST and Fashion-MNIST, illustrating the advantage of the skip connection in TTFS coding architectures. Additionally, we demonstrate the applicability of TTFS coding on beyond image recognition tasks and extend it to scientific machine-learning tasks, broadening the potential uses of SNNs.
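The abstract's core observation can be illustrated with a toy sketch. This is plain NumPy, not the paper's implementation: activations are represented directly as first-spike times (smaller time = stronger activation), the spike-time ranges of the two branches are hand-chosen for illustration, and the skip-branch delay is hand-set here rather than learned as in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy TTFS activations: the convolutional branch fires late (it passes
# through extra layers), the skip branch fires early.
conv_times = rng.uniform(5.0, 9.0, size=8)  # late first-spike times
skip_times = rng.uniform(0.0, 3.0, size=8)  # early first-spike times

# Concatenation-based skip connection: the channel groups are stacked,
# but their firing windows barely overlap -- the "time gap" the paper
# describes, which limits mixing of information from the two paths.
concat = np.concatenate([conv_times, skip_times])
gap = conv_times.min() - skip_times.max()  # positive => disjoint windows

# A delay on the skip branch (learned in the paper, hand-set here to the
# difference of branch means) shifts the skip spikes into the same
# firing window as the convolutional branch.
delay = conv_times.mean() - skip_times.mean()
aligned = np.concatenate([conv_times, skip_times + delay])
```

With the delay applied, both halves of `aligned` occupy the same time window, so downstream neurons can integrate spikes from both paths together; an addition-based skip connection would instead avoid the gap but push all first-spike times later.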


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/392805b88dad/fnins-18-1346805-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/47d17b412867/fnins-18-1346805-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/6aa719ee0eb2/fnins-18-1346805-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/1eea4b546ba9/fnins-18-1346805-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/180a8feac15a/fnins-18-1346805-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/28ef85b80391/fnins-18-1346805-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/24bc2b6d6001/fnins-18-1346805-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/51c94494f131/fnins-18-1346805-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f124/10899405/16ee1da14eec/fnins-18-1346805-g0009.jpg

Similar articles

1. Rethinking skip connections in Spiking Neural Networks with Time-To-First-Spike coding.
   Front Neurosci. 2024 Feb 14;18:1346805. doi: 10.3389/fnins.2024.1346805. eCollection 2024.
2. A TTFS-based energy and utilization efficient neuromorphic CNN accelerator.
   Front Neurosci. 2023 May 5;17:1121592. doi: 10.3389/fnins.2023.1121592. eCollection 2023.
3. Learnable axonal delay in spiking neural networks improves spoken word recognition.
   Front Neurosci. 2023 Nov 9;17:1275944. doi: 10.3389/fnins.2023.1275944. eCollection 2023.
4. Sparse-firing regularization methods for spiking neural networks with time-to-first-spike coding.
   Sci Rep. 2023 Dec 21;13(1):22897. doi: 10.1038/s41598-023-50201-5.
5. Performance improvement of weakly supervised fully convolutional networks by skip connections for brain structure segmentation.
   Med Phys. 2021 Nov;48(11):7215-7227. doi: 10.1002/mp.15192. Epub 2021 Sep 13.
6. First-spike coding promotes accurate and efficient spiking neural networks for discrete events with rich temporal structures.
   Front Neurosci. 2023 Oct 2;17:1266003. doi: 10.3389/fnins.2023.1266003. eCollection 2023.
7. Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.
   Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.
8. Efficient Processing of Spatio-Temporal Data Streams With Spiking Neural Networks.
   Front Neurosci. 2020 May 5;14:439. doi: 10.3389/fnins.2020.00439. eCollection 2020.
9. Delay learning based on temporal coding in Spiking Neural Networks.
   Neural Netw. 2024 Dec;180:106678. doi: 10.1016/j.neunet.2024.106678. Epub 2024 Aug 31.
10. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
   Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.

References cited in this article

1. Effective Surrogate Gradient Learning With High-Order Information Bottleneck for Spike-Based Machine Intelligence.
   IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):1734-1748. doi: 10.1109/TNNLS.2023.3329525. Epub 2025 Jan 7.
2. A survey of sound source localization with deep learning methods.
   J Acoust Soc Am. 2022 Jul;152(1):107. doi: 10.1121/10.0011809.
3. Rethinking the Role of Normalization and Residual Blocks for Spiking Neural Networks.
   Sensors (Basel). 2022 Apr 8;22(8):2876. doi: 10.3390/s22082876.
4. Spiking Deep Residual Networks.
   IEEE Trans Neural Netw Learn Syst. 2023 Aug;34(8):5200-5205. doi: 10.1109/TNNLS.2021.3119238. Epub 2023 Aug 4.
5. Progressive Tandem Learning for Pattern Recognition With Deep Spiking Neural Networks.
   IEEE Trans Pattern Anal Mach Intell. 2022 Nov;44(11):7824-7840. doi: 10.1109/TPAMI.2021.3114196. Epub 2022 Oct 4.
6. Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks.
   IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1947-1958. doi: 10.1109/TNNLS.2021.3110991. Epub 2022 May 2.
7. A Supervised Learning Algorithm for Multilayer Spiking Neural Networks Based on Temporal Coding Toward Energy-Efficient VLSI Processor Design.
   IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):394-408. doi: 10.1109/TNNLS.2021.3095068. Epub 2023 Jan 5.
8. Neural Coding in Spiking Neural Networks: A Comparative Study for Robust Neuromorphic Systems.
   Front Neurosci. 2021 Mar 4;15:638474. doi: 10.3389/fnins.2021.638474. eCollection 2021.
9. Toward Scalable, Efficient, and Accurate Deep Spiking Neural Networks With Backward Residual Connections, Stochastic Softmax, and Hybridization.
   Front Neurosci. 2020 Jun 30;14:653. doi: 10.3389/fnins.2020.00653. eCollection 2020.
10. Temporal Backpropagation for Spiking Neural Networks with One Spike per Neuron.
   Int J Neural Syst. 2020 Jun;30(6):2050027. doi: 10.1142/S0129065720500276. Epub 2020 May 28.