Suppr 超能文献



Learning long sequences in spiking neural networks.

Authors

Stan Matei-Ioan, Rhodes Oliver

Affiliation

Department of Computer Science, The University of Manchester, Manchester, UK.

Publication

Sci Rep. 2024 Sep 20;14(1):21957. doi: 10.1038/s41598-024-71678-8.

DOI:10.1038/s41598-024-71678-8
PMID:39304663
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11415486/
Abstract

Spiking neural networks (SNNs) take inspiration from the brain to enable energy-efficient computations. Since the advent of Transformers, SNNs have struggled to compete with artificial networks on modern sequential tasks, as they inherit limitations from recurrent neural networks (RNNs), with the added challenge of training with non-differentiable binary spiking activations. However, a recent renewed interest in efficient alternatives to Transformers has given rise to state-of-the-art recurrent architectures named state space models (SSMs). This work systematically investigates, for the first time, the intersection of state-of-the-art SSMs with SNNs for long-range sequence modelling. Results suggest that SSM-based SNNs can outperform the Transformer on all tasks of a well-established long-range sequence modelling benchmark. It is also shown that SSM-based SNNs can outperform current state-of-the-art SNNs with fewer parameters on sequential image classification. Finally, a novel feature mixing layer is introduced, improving SNN accuracy while challenging assumptions about the role of binary activations in SNNs. This work paves the way for deploying powerful SSM-based architectures, such as large language models, to neuromorphic hardware for energy-efficient long-range sequence modelling.
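The core idea the abstract describes, a linear state-space recurrence whose output is binarised into spikes, can be sketched as follows. This is a hypothetical minimal illustration, not the paper's architecture: the diagonal state matrix `A`, input matrix `B`, and threshold are toy assumptions, and real SSM-based SNNs use structured initialisations and surrogate gradients for training through the non-differentiable spike.

```python
import numpy as np

def ssm_spiking_forward(u, A, B, threshold=1.0):
    """Run a diagonal linear state-space recurrence over an input
    sequence u (shape [T, d_in]), then binarise the state with a
    Heaviside step, sketching one SSM-based spiking layer:

        x_k = A * x_{k-1} + B @ u_k   (A holds the diagonal of the state matrix)
        s_k = 1 if x_k > threshold else 0
    """
    T, _ = u.shape
    d_state = A.shape[0]
    x = np.zeros(d_state)
    spikes = np.zeros((T, d_state))
    for k in range(T):
        x = A * x + B @ u[k]          # linear state update (parallelisable in practice)
        spikes[k] = x > threshold     # non-differentiable binary spike; training
                                      # would replace its gradient with a surrogate
    return spikes

# Toy usage: 5 timesteps, 2 input channels, 3 state dimensions
rng = np.random.default_rng(0)
u = rng.normal(size=(5, 2))
A = np.full(3, 0.9)                   # decay < 1 keeps the recurrence stable
B = rng.normal(size=(3, 2))
out = ssm_spiking_forward(u, A, B)
print(out.shape)  # (5, 3)
```

Because the state update is linear, it admits the parallel-scan or convolutional evaluation that makes SSMs efficient on long sequences; only the thresholding step is nonlinear and spiking.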


Similar articles

1
Learning long sequences in spiking neural networks.
Sci Rep. 2024 Sep 20;14(1):21957. doi: 10.1038/s41598-024-71678-8.
2
SpQuant-SNN: ultra-low precision membrane potential with sparse activations unlock the potential of on-device spiking neural networks applications.
Front Neurosci. 2024 Sep 4;18:1440000. doi: 10.3389/fnins.2024.1440000. eCollection 2024.
3
A TTFS-based energy and utilization efficient neuromorphic CNN accelerator.
Front Neurosci. 2023 May 5;17:1121592. doi: 10.3389/fnins.2023.1121592. eCollection 2023.
4
SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
5
Efficient Processing of Spatio-Temporal Data Streams With Spiking Neural Networks.
Front Neurosci. 2020 May 5;14:439. doi: 10.3389/fnins.2020.00439. eCollection 2020.
6
Neuromorphic Sentiment Analysis Using Spiking Neural Networks.
Sensors (Basel). 2023 Sep 6;23(18):7701. doi: 10.3390/s23187701.
7
Deep Learning With Spiking Neurons: Opportunities and Challenges.
Front Neurosci. 2018 Oct 25;12:774. doi: 10.3389/fnins.2018.00774. eCollection 2018.
8
Locally connected spiking neural networks for unsupervised feature learning.
Neural Netw. 2019 Nov;119:332-340. doi: 10.1016/j.neunet.2019.08.016. Epub 2019 Aug 26.
9
Direct training high-performance deep spiking neural networks: a review of theories and methods.
Front Neurosci. 2024 Jul 31;18:1383844. doi: 10.3389/fnins.2024.1383844. eCollection 2024.
10
Rethinking the performance comparison between SNNS and ANNS.
Neural Netw. 2020 Jan;121:294-307. doi: 10.1016/j.neunet.2019.09.005. Epub 2019 Sep 19.

Cited by

1
The road to commercial success for neuromorphic technologies.
Nat Commun. 2025 Apr 15;16(1):3586. doi: 10.1038/s41467-025-57352-1.

References

1
Effective Surrogate Gradient Learning With High-Order Information Bottleneck for Spike-Based Machine Intelligence.
IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):1734-1748. doi: 10.1109/TNNLS.2023.3329525. Epub 2025 Jan 7.
2
SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
3
Nyströmformer: A Nyström-based Algorithm for Approximating Self-Attention.
Proc AAAI Conf Artif Intell. 2021;35(16):14138-14148. Epub 2021 May 18.
4
Dynamic Spatiotemporal Pattern Recognition With Recurrent Spiking Neural Network.
Neural Comput. 2021 Oct 12;33(11):2971-2995. doi: 10.1162/neco_a_01432.
5
Comparison of Artificial and Spiking Neural Networks on Digital Hardware.
Front Neurosci. 2021 Apr 6;15:651141. doi: 10.3389/fnins.2021.651141. eCollection 2021.
6
Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
7
Memory in linear recurrent neural networks in continuous time.
Neural Netw. 2010 Apr;23(3):341-55. doi: 10.1016/j.neunet.2009.08.008. Epub 2009 Aug 31.
8
Long short-term memory.
Neural Comput. 1997 Nov 15;9(8):1735-80. doi: 10.1162/neco.1997.9.8.1735.
9
Why there are complementary learning systems in the hippocampus and neocortex: insights from the successes and failures of connectionist models of learning and memory.
Psychol Rev. 1995 Jul;102(3):419-457. doi: 10.1037/0033-295X.102.3.419.