
TCJA-SNN: Temporal-Channel Joint Attention for Spiking Neural Networks

Authors

Zhu Rui-Jie, Zhang Malu, Zhao Qihang, Deng Haoyu, Duan Yule, Deng Liang-Jian

Publication

IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):5112-5125. doi: 10.1109/TNNLS.2024.3377717. Epub 2025 Feb 28.

DOI: 10.1109/TNNLS.2024.3377717
PMID: 38598397
Abstract

Spiking neural networks (SNNs) are attracting widespread interest due to their biological plausibility, energy efficiency, and powerful spatiotemporal information representation ability. Given the critical role of attention mechanisms in enhancing neural network performance, the integration of SNNs and attention mechanisms exhibits tremendous potential to deliver energy-efficient and high-performance computing paradigms. In this article, we present a novel temporal-channel joint attention mechanism for SNNs, referred to as TCJA-SNN. The proposed TCJA-SNN framework can effectively assess the significance of the spike sequence from both spatial and temporal dimensions. More specifically, our essential technical contribution lies in: 1) we employ the squeeze operation to compress the spike stream into an average matrix, then leverage two local attention mechanisms based on efficient 1-D convolutions to facilitate comprehensive feature extraction at the temporal and channel levels independently; and 2) we introduce the cross-convolutional fusion (CCF) layer as a novel approach to model the interdependencies between the temporal and channel scopes. This layer effectively breaks the independence of these two dimensions and enables the interaction between features. Experimental results demonstrate that the proposed TCJA-SNN outperforms the state-of-the-art (SOTA) on all standard static and neuromorphic datasets, including Fashion-MNIST, CIFAR10, CIFAR100, CIFAR10-DVS, N-Caltech 101, and DVS128 Gesture. Furthermore, we effectively apply the TCJA-SNN framework to image generation tasks by leveraging a variational autoencoder. To the best of our knowledge, this study is the first instance where the SNN-attention mechanism has been employed for high-level classification and low-level generation tasks. Our implementation codes are available at https://github.com/ridgerchu/TCJA.
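The pipeline the abstract describes — squeeze the spike stream into an average matrix, run two local attentions via 1-D convolutions along the temporal and channel axes, then fuse them — can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation (see the linked GitHub repository for that): the function names, the fixed kernel weights, the sigmoid gating, and the element-wise product standing in for the cross-convolutional fusion (CCF) layer are all simplifying assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv1d_same(x, w):
    # Zero-padded "same" 1-D cross-correlation along the last axis,
    # as a convolution layer would compute it.
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, [(0, 0)] * (x.ndim - 1) + [(pad, pad)])
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[-1]):
        out[..., i] = (xp[..., i:i + k] * w).sum(-1)
    return out

def tcja(spikes, w_t, w_c):
    """Toy temporal-channel joint attention over a (T, C, H, W) spike tensor.

    1) Squeeze: average over the spatial dims -> (T, C) matrix.
    2) Two local attentions: 1-D conv along T (per channel) and
       along C (per time step).
    3) Fusion: element-wise product + sigmoid (simplified stand-in
       for the CCF layer), broadcast back over the spike tensor.
    """
    z = spikes.mean(axis=(2, 3))            # squeeze -> (T, C)
    a_t = conv1d_same(z.T, w_t).T           # conv along the temporal axis
    a_c = conv1d_same(z, w_c)               # conv along the channel axis
    attn = sigmoid(a_t * a_c)               # joint attention map, (T, C)
    return spikes * attn[:, :, None, None]  # recalibrated spike stream
```

The key design point is that both attentions operate on the same squeezed (T, C) matrix, so the multiplicative fusion lets temporal and channel saliency interact instead of being applied independently.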


Similar Articles

1
TCJA-SNN: Temporal-Channel Joint Attention for Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):5112-5125. doi: 10.1109/TNNLS.2024.3377717. Epub 2025 Feb 28.
2
STCA-SNN: self-attention-based temporal-channel joint attention for spiking neural networks.
Front Neurosci. 2023 Nov 10;17:1261543. doi: 10.3389/fnins.2023.1261543. eCollection 2023.
3
STSC-SNN: Spatio-Temporal Synaptic Connection with temporal convolution and attention for spiking neural networks.
Front Neurosci. 2022 Dec 23;16:1079357. doi: 10.3389/fnins.2022.1079357. eCollection 2022.
4
Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
5
Towards parameter-free attentional spiking neural networks.
Neural Netw. 2025 May;185:107154. doi: 10.1016/j.neunet.2025.107154. Epub 2025 Jan 16.
6
SpQuant-SNN: ultra-low precision membrane potential with sparse activations unlock the potential of on-device spiking neural networks applications.
Front Neurosci. 2024 Sep 4;18:1440000. doi: 10.3389/fnins.2024.1440000. eCollection 2024.
7
A Spatial-Channel-Temporal-Fused Attention for Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):14315-14329. doi: 10.1109/TNNLS.2023.3278265. Epub 2024 Oct 7.
8
SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
9
SGLFormer: Spiking Global-Local-Fusion Transformer with high performance.
Front Neurosci. 2024 Mar 12;18:1371290. doi: 10.3389/fnins.2024.1371290. eCollection 2024.
10
Sparser spiking activity can be better: Feature Refine-and-Mask spiking neural network for event-based visual recognition.
Neural Netw. 2023 Sep;166:410-423. doi: 10.1016/j.neunet.2023.07.008. Epub 2023 Jul 20.

Cited By

1
Balancing Energy Consumption and Detection Accuracy in Cardiovascular Disease Diagnosis: A Spiking Neural Network-Based Approach with ECG and PCG Signals.
Sensors (Basel). 2025 Aug 24;25(17):5263. doi: 10.3390/s25175263.
2
Accurate and efficient stock market index prediction: an integrated approach based on VMD-SNNs.
J Appl Stat. 2024 Sep 3;52(4):841-867. doi: 10.1080/02664763.2024.2395961. eCollection 2025.
3
Motion feature extraction using magnocellular-inspired spiking neural networks for drone detection.
Front Comput Neurosci. 2025 Jan 22;19:1452203. doi: 10.3389/fncom.2025.1452203. eCollection 2025.
4
Sg-snn: a self-organizing spiking neural network based on temporal information.
Cogn Neurodyn. 2025 Dec;19(1):14. doi: 10.1007/s11571-024-10199-6. Epub 2025 Jan 9.
5
Memristive leaky integrate-and-fire neuron and learnable straight-through estimator in spiking neural networks.
Cogn Neurodyn. 2024 Oct;18(5):3075-3091. doi: 10.1007/s11571-024-10133-w. Epub 2024 Jun 20.
6
STCA-SNN: self-attention-based temporal-channel joint attention for spiking neural networks.
Front Neurosci. 2023 Nov 10;17:1261543. doi: 10.3389/fnins.2023.1261543. eCollection 2023.
7
SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence.
Sci Adv. 2023 Oct 6;9(40):eadi1480. doi: 10.1126/sciadv.adi1480.
8
Direct learning-based deep spiking neural networks: a review.
Front Neurosci. 2023 Jun 16;17:1209795. doi: 10.3389/fnins.2023.1209795. eCollection 2023.
9
STSC-SNN: Spatio-Temporal Synaptic Connection with temporal convolution and attention for spiking neural networks.
Front Neurosci. 2022 Dec 23;16:1079357. doi: 10.3389/fnins.2022.1079357. eCollection 2022.