Suppr 超能文献



Efficient spiking neural network design via neural architecture search.

Affiliations

Zhejiang University, Hangzhou, 310027, China.

National University of Singapore, 119077, Singapore.

Publication information

Neural Netw. 2024 May;173:106172. doi: 10.1016/j.neunet.2024.106172. Epub 2024 Feb 16.

DOI: 10.1016/j.neunet.2024.106172
PMID: 38402808
Abstract

Spiking neural networks (SNNs) are brain-inspired models that utilize discrete and sparse spikes to transmit information, thus having the property of energy efficiency. Recent advances in learning algorithms have greatly improved SNN performance due to the automation of feature engineering. While the choice of neural architecture plays a significant role in deep learning, the current SNN architectures are mainly designed manually, which is a time-consuming and error-prone process. In this paper, we propose a spiking neural architecture search (NAS) method that can automatically find efficient SNNs. To tackle the challenge of long search time faced by SNNs when utilizing NAS, the proposed NAS encodes candidate architectures in a branchless spiking supernet which significantly reduces the computation requirements in the search process. Considering that real-world tasks prefer efficient networks with optimal accuracy under a limited computational budget, we propose a Synaptic Operation (SynOps)-aware optimization to automatically find the computationally efficient subspace of the supernet. Experimental results show that, in less search time, our proposed NAS can find SNNs with higher accuracy and lower computational cost than state-of-the-art SNNs. We also conduct experiments to validate the search process and the trade-off between accuracy and computational cost.
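The SynOps-aware idea in the abstract — search only among candidate networks whose estimated synaptic operations fit a computational budget, then pick the most accurate one — can be illustrated with a toy random-search sketch. The search space, the SynOps estimate, and the accuracy proxy below are all illustrative assumptions for exposition; they are not the paper's actual supernet, training procedure, or scoring.

```python
import random

# Hypothetical toy search space: a candidate architecture is just a
# (channels, depth) pair. Purely illustrative, not the paper's supernet.
SEARCH_SPACE = {"channels": [16, 32, 64], "depth": [2, 3, 4]}


def estimate_synops(channels, depth, spike_rate=0.1):
    """Rough SynOps proxy: synaptic operations scale with per-layer fan-in
    (channels * channels), depth, and the fraction of neurons that spike."""
    return int(depth * channels * channels * spike_rate * 1000)


def proxy_accuracy(channels, depth):
    """Stand-in for validation accuracy: larger models score higher with
    diminishing returns (illustrative only)."""
    return 1.0 - 1.0 / (channels * depth) ** 0.5


def search(budget_synops, n_trials=200, seed=0):
    """SynOps-aware random search: sample candidates, discard any that
    exceed the SynOps budget, keep the best-scoring survivor."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        c = rng.choice(SEARCH_SPACE["channels"])
        d = rng.choice(SEARCH_SPACE["depth"])
        synops = estimate_synops(c, d)
        if synops > budget_synops:  # budget constraint prunes the space
            continue
        score = proxy_accuracy(c, d)
        if best is None or score > best[0]:
            best = (score, c, d, synops)
    return best  # (accuracy proxy, channels, depth, synops)


if __name__ == "__main__":
    print(search(budget_synops=500_000))
```

The paper's method differs substantially (a trained branchless supernet rather than per-candidate scoring), but the sketch shows the accuracy-versus-SynOps trade-off the abstract describes: tightening `budget_synops` forces the search toward sparser, cheaper architectures at some cost in the accuracy proxy.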


Similar articles

1. Efficient spiking neural network design via neural architecture search.
   Neural Netw. 2024 May;173:106172. doi: 10.1016/j.neunet.2024.106172. Epub 2024 Feb 16.
2. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
   Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
3. HybridSNN: Combining Bio-Machine Strengths by Boosting Adaptive Spiking Neural Networks.
   IEEE Trans Neural Netw Learn Syst. 2023 Sep;34(9):5841-5855. doi: 10.1109/TNNLS.2021.3131356. Epub 2023 Sep 1.
4. Sampling complex topology structures for spiking neural networks.
   Neural Netw. 2024 Apr;172:106121. doi: 10.1016/j.neunet.2024.106121. Epub 2024 Jan 10.
5. Rethinking the performance comparison between SNNS and ANNS.
   Neural Netw. 2020 Jan;121:294-307. doi: 10.1016/j.neunet.2019.09.005. Epub 2019 Sep 19.
6. Optimizing Deeper Spiking Neural Networks for Dynamic Vision Sensing.
   Neural Netw. 2021 Dec;144:686-698. doi: 10.1016/j.neunet.2021.09.022. Epub 2021 Oct 5.
7. Neuromorphic Sentiment Analysis Using Spiking Neural Networks.
   Sensors (Basel). 2023 Sep 6;23(18):7701. doi: 10.3390/s23187701.
8. SpQuant-SNN: ultra-low precision membrane potential with sparse activations unlock the potential of on-device spiking neural networks applications.
   Front Neurosci. 2024 Sep 4;18:1440000. doi: 10.3389/fnins.2024.1440000. eCollection 2024.
9. One-Shot Neural Architecture Search by Dynamically Pruning Supernet in Hierarchical Order.
   Int J Neural Syst. 2021 Jul;31(7):2150029. doi: 10.1142/S0129065721500295. Epub 2021 Jun 14.
10. Advancements in Algorithms and Neuromorphic Hardware for Spiking Neural Networks.
    Neural Comput. 2022 May 19;34(6):1289-1328. doi: 10.1162/neco_a_01499.

Cited by

1. Auto Deep Spiking Neural Network Design Based on an Evolutionary Membrane Algorithm.
   Biomimetics (Basel). 2025 Aug 6;10(8):514. doi: 10.3390/biomimetics10080514.