

Enhancing spiking neural networks with hybrid top-down attention.

Authors

Liu Faqiang, Zhao Rong

Affiliation

Department of Precision Instrument, Center for Brain-Inspired Computing Research, Beijing Advanced Innovation Center for Integrated Circuits, Tsinghua University, Beijing, China.

Publication

Front Neurosci. 2022 Aug 22;16:949142. doi: 10.3389/fnins.2022.949142. eCollection 2022.

Abstract

As the representatives of brain-inspired models at the neuronal level, spiking neural networks (SNNs) have shown great promise in processing spatiotemporal information with their intrinsic temporal dynamics. SNNs are expected to further improve in robustness and computing efficiency by introducing top-down attention at the architectural level, a mechanism that is crucial for the human brain to support advanced intelligence. However, this attempt encounters difficulties in optimizing the attention in SNNs, largely due to the lack of annotations. Here, we develop a hybrid network model with a top-down attention mechanism (HTDA) by incorporating an artificial neural network (ANN) that generates attention maps based on the features extracted by a feedforward SNN. The attention map is then used to modulate the encoding layer of the SNN so that it focuses on the most informative sensory input. To facilitate direct learning of attention maps and avoid labor-intensive annotation, we propose a general principle and a corresponding weakly-supervised objective, which encourages the HTDA model to use an integral and small subset of the input to give accurate predictions. On this basis, the ANN and the SNN can be jointly optimized by surrogate gradient descent in an end-to-end manner. We comprehensively evaluated the HTDA model on object recognition tasks, demonstrating strong robustness to adversarial noise, high computing efficiency, and good interpretability. On the widely adopted CIFAR-10, CIFAR-100, and MNIST benchmarks, the HTDA model reduces firing rates by up to 50% and improves adversarial robustness by up to 10% with comparable or better accuracy than state-of-the-art SNNs. The HTDA model is also verified on dynamic neuromorphic datasets and achieves consistent improvements. This study provides a new way to boost the performance of SNNs by employing a hybrid top-down attention mechanism.
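The hybrid forward pass described in the abstract — a feedforward SNN extracts features, an ANN branch maps them to an attention map, and the map gates the SNN's encoding-layer input — can be sketched minimally as follows. This is an illustrative NumPy sketch, not the paper's implementation: the LIF encoder, sigmoid attention head, weight shapes, and the two penalty terms standing in for the weakly-supervised "integral and small" objective are all assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_encode(x, steps=8, tau=2.0, v_th=1.0):
    """Rate-code input x into spike trains with a simple LIF neuron
    (illustrative stand-in for the paper's feedforward SNN encoder)."""
    v = np.zeros_like(x)
    spikes = []
    for _ in range(steps):
        v = v + (x - v) / tau          # leaky integration toward the input
        s = (v >= v_th).astype(float)  # spike where membrane crosses threshold
        v = v * (1.0 - s)              # hard reset after a spike
        spikes.append(s)
    return np.stack(spikes)            # shape: (steps, *x.shape)

def attention_map(features, w):
    """ANN branch: map SNN features to a [0, 1] attention map via a sigmoid."""
    return 1.0 / (1.0 + np.exp(-(features @ w)))

# toy 4x4 "image" and one hybrid forward pass
x = rng.uniform(0.0, 2.0, size=(4, 4))
spikes = lif_encode(x)                  # bottom-up SNN pass
features = spikes.mean(axis=0)          # firing rates as features
w = rng.normal(scale=0.5, size=(4, 4))  # hypothetical ANN attention weights
a = attention_map(features, w)          # top-down attention map in (0, 1)
x_mod = a * x                           # modulated input to the encoding layer

# the weakly-supervised objective favours maps that are both small
# (sparse, low L1 norm) and integral (spatially coherent); sketched
# here as two penalty terms only, with accuracy loss omitted
sparsity = np.abs(a).mean()
smoothness = (np.abs(np.diff(a, axis=0)).mean()
              + np.abs(np.diff(a, axis=1)).mean())
```

In the actual model both branches are trained jointly end to end, with surrogate gradients carrying the learning signal through the non-differentiable spike threshold; the sketch above only shows the forward dataflow.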


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/541d/9443487/d74be86c1f87/fnins-16-949142-g0001.jpg
