Attention Spiking Neural Networks.

Author Information

Yao Man, Zhao Guangshe, Zhang Hengyu, Hu Yifan, Deng Lei, Tian Yonghong, Xu Bo, Li Guoqi

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2023 Aug;45(8):9393-9410. doi: 10.1109/TPAMI.2023.3241201. Epub 2023 Jun 30.

Abstract

Brain-inspired spiking neural networks (SNNs) are emerging as a promising energy-efficient alternative to traditional artificial neural networks (ANNs). However, the performance gap between SNNs and ANNs has been a significant obstacle to deploying SNNs ubiquitously. To unlock the full potential of SNNs, in this paper we study attention mechanisms, which help humans focus on important information. We present attention in SNNs through a multi-dimensional attention module that infers attention weights along the temporal, channel, and spatial dimensions, separately or simultaneously. Grounded in existing neuroscience theories, we exploit the attention weights to optimize membrane potentials, which in turn regulate the spiking response. Extensive experiments on event-based action recognition and image classification datasets demonstrate that attention enables vanilla SNNs to achieve sparser spiking, better performance, and higher energy efficiency concurrently. In particular, we achieve top-1 accuracies of 75.92% and 77.08% on ImageNet-1K with single-step and 4-step Res-SNN-104, which are state-of-the-art results among SNNs. Compared with the counterpart Res-ANN-104, the performance gap is -0.95/+0.21 percentage points and the energy efficiency gain is 31.8×/7.4×. To analyze the effectiveness of attention SNNs, we theoretically prove that spiking degradation and gradient vanishing, which commonly arise in general SNNs, can be resolved by introducing block dynamical isometry theory. We also analyze the efficiency of attention SNNs using our proposed spiking-response visualization method. Our work highlights the potential of SNNs as a general backbone for various applications in SNN research, with a strong balance between effectiveness and energy efficiency.
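The mechanism the abstract describes, attention weights that rescale the membrane potential and thereby gate the spiking response, can be illustrated with a short sketch. The PyTorch code below is a minimal, hypothetical illustration under our own assumptions (a leaky integrate-and-fire neuron with hard reset, and squeeze-and-excitation-style channel attention only); the names ChannelAttention and AttentionLIF are ours, not from the paper's implementation, and the surrogate gradient needed to train through the spike threshold is omitted for brevity.

```python
# Sketch of attention applied to membrane potential in an SNN layer.
# Illustrative only: names and details are assumptions, not the paper's code.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation-style channel attention. The paper's module is
    multi-dimensional (temporal/channel/spatial); only channel is shown here."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: membrane potential, shape (batch, channels, height, width)
        w = self.fc(u.mean(dim=(2, 3)))   # per-channel weights in (0, 1)
        return u * w[:, :, None, None]    # rescale the membrane potential


class AttentionLIF(nn.Module):
    """Leaky integrate-and-fire neuron whose membrane potential is rescaled
    by attention before the firing threshold is applied."""

    def __init__(self, channels: int, tau: float = 2.0, v_th: float = 1.0):
        super().__init__()
        self.attn = ChannelAttention(channels)
        self.tau, self.v_th = tau, v_th

    def forward(self, x_seq: torch.Tensor) -> torch.Tensor:
        # x_seq: input current, shape (time, batch, channels, height, width)
        v = torch.zeros_like(x_seq[0])
        spikes = []
        for x in x_seq:                    # unroll over time steps
            v = v + (x - v) / self.tau     # leaky integration
            v = self.attn(v)               # attention optimizes the potential
            s = (v >= self.v_th).float()   # spike where threshold is reached
            v = v * (1.0 - s)              # hard reset after a spike
            spikes.append(s)
        return torch.stack(spikes)
```

In this reading, temporal and spatial attention would follow the same pattern: compute a gating tensor over the relevant dimension and multiply it into the membrane potential before thresholding, so that attenuated potentials fire fewer spikes, which is consistent with the sparser firing the abstract reports.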

