

Spike-Based Approximate Backpropagation Algorithm of Brain-Inspired Deep SNN for Sonar Target Classification.

Affiliations

Henan Province Engineering Research Center of Spatial Information Processing, Kaifeng 475004, China.

College of Computer and Information Engineering, Henan University, Kaifeng 475004, China.

Publication Information

Comput Intell Neurosci. 2022 Oct 20;2022:1633946. doi: 10.1155/2022/1633946. eCollection 2022.

DOI: 10.1155/2022/1633946
PMID: 36313052
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9613403/
Abstract

With the development of neuromorphic computing, brain-inspired spiking neural networks (SNNs) have attracted increasing attention for their ultralow energy consumption and high-performance spatiotemporal information processing. Because the spiking neuron's activation function is discontinuous, directly training brain-inspired deep SNNs remains difficult, and SNNs have therefore not yet matched the performance of artificial neural networks. To address this, this paper proposes a spike-based approximate backpropagation (SABP) algorithm and a general brain-inspired SNN framework, which together enable end-to-end direct training of brain-inspired deep SNNs. Experiments show that, compared with other spike-based methods for directly training SNNs, the proposed method approaches the best reported classification accuracy on the MNIST and CIFAR-10 datasets and achieves the best accuracy on a small-sample sonar image target classification (SITC) dataset. Further analysis shows that, compared with artificial neural networks, the proposed brain-inspired SNN offers substantial advantages in computational complexity and energy consumption for sonar target classification.
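The training obstacle the abstract names — the spike activation is a discontinuous step, so its true gradient is zero almost everywhere — is typically circumvented by keeping the hard spike in the forward pass and substituting a windowed surrogate derivative in the backward pass. The PyTorch sketch below shows this general surrogate-gradient recipe for a single leaky integrate-and-fire (LIF) layer; it illustrates the family of methods the paper belongs to, not the authors' exact SABP algorithm, and every constant (time steps, threshold, decay, surrogate window, layer sizes) is an illustrative assumption.

```python
import torch


class SpikeFn(torch.autograd.Function):
    """Heaviside spike in the forward pass, rectangular surrogate
    gradient in the backward pass (a straight-through-style approximation)."""

    @staticmethod
    def forward(ctx, v_minus_th):
        ctx.save_for_backward(v_minus_th)
        return (v_minus_th > 0).float()  # spike iff membrane exceeds threshold

    @staticmethod
    def backward(ctx, grad_out):
        (v_minus_th,) = ctx.saved_tensors
        # Pass gradient only in a window around the threshold; the true
        # derivative of the step function is zero almost everywhere.
        return grad_out * (v_minus_th.abs() < 0.5).float()


def lif_layer(x_seq, w, tau=2.0, v_th=1.0):
    """Unroll a leaky integrate-and-fire layer over T time steps and
    return per-neuron spike counts (a rate-coded output).

    x_seq: (T, batch, n_in) binary input spike trains.
    w:     (n_out, n_in) synaptic weights.
    """
    v = torch.zeros(x_seq.shape[1], w.shape[0])
    counts = torch.zeros_like(v)
    for x_t in x_seq:
        v = v / tau + x_t @ w.t()    # leaky integration of synaptic current
        s = SpikeFn.apply(v - v_th)  # hard spike forward, surrogate backward
        v = v * (1.0 - s)            # hard reset of neurons that fired
        counts = counts + s
    return counts


# Toy usage: one gradient step on random Poisson-like input spikes.
T, batch, n_in, n_out = 20, 4, 100, 10
w = torch.nn.Parameter(0.1 * torch.randn(n_out, n_in))
x = (torch.rand(T, batch, n_in) < 0.3).float()
target = torch.randint(0, n_out, (batch,))

loss = torch.nn.functional.cross_entropy(lif_layer(x, w), target)
loss.backward()   # gradients reach w through the surrogate spike function
print(w.grad.abs().mean())
```

Because the hard threshold and reset are kept in the forward pass, the network trained this way remains a genuine spiking network at inference time; only the backward pass is approximated.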

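The abstract's closing claim about computational cost rests on a standard argument: a conventional ANN performs one multiply-accumulate (MAC) per synapse per inference, while an event-driven SNN performs a cheaper accumulate (AC) only when a presynaptic spike arrives. A back-of-the-envelope sketch, where the layer size, spike rate, time steps, and per-operation energies are all illustrative assumptions (not values reported in the paper):

```python
# Rough comparison of synaptic operation counts and energy for an ANN
# vs. an event-driven SNN. All numbers below are illustrative assumptions.
synapses = 100_000      # total synaptic connections in the network
T = 20                  # SNN simulation time steps per inference
spike_rate = 0.1        # average spikes per neuron per time step

E_MAC = 4.6e-12         # J per 32-bit multiply-accumulate (often-quoted 45 nm estimate)
E_AC = 0.9e-12          # J per 32-bit accumulate

ann_ops = synapses                    # one MAC per synapse per inference
snn_ops = synapses * T * spike_rate   # one AC per synapse per presynaptic spike

ann_energy = ann_ops * E_MAC
snn_energy = snn_ops * E_AC

print(f"ANN: {ann_ops:.0f} MACs, {ann_energy * 1e9:.1f} nJ")
print(f"SNN: {snn_ops:.0f} ACs,  {snn_energy * 1e9:.1f} nJ")
print(f"energy ratio (ANN/SNN): {ann_energy / snn_energy:.1f}x")
```

The advantage therefore hinges on spike sparsity: the SNN comes out ahead only while the spikes per neuron per inference (rate × T) stay below the MAC/AC energy ratio, which is why low firing rates matter for claims like the one above.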

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/d00b14c91adb/CIN2022-1633946.001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/18c114506a9f/CIN2022-1633946.002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/004ae18d455e/CIN2022-1633946.003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/e2bf685538ac/CIN2022-1633946.004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/4db1f68705f4/CIN2022-1633946.005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/14a2f8d06cbd/CIN2022-1633946.006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/afc842ed442f/CIN2022-1633946.007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/50fc84992ce3/CIN2022-1633946.008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4280/9613403/9e5b99d5070a/CIN2022-1633946.009.jpg

Similar Articles

1
Spike-Based Approximate Backpropagation Algorithm of Brain-Inspired Deep SNN for Sonar Target Classification.
Comput Intell Neurosci. 2022 Oct 20;2022:1633946. doi: 10.1155/2022/1633946. eCollection 2022.
2
SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
3
Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.
Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.
4
A Tandem Learning Rule for Effective Training and Rapid Inference of Deep Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):446-460. doi: 10.1109/TNNLS.2021.3095724. Epub 2023 Jan 5.
5
Exploring Optimized Spiking Neural Network Architectures for Classification Tasks on Embedded Platforms.
Sensors (Basel). 2021 May 7;21(9):3240. doi: 10.3390/s21093240.
6
A Parallel Spiking Neural Network Based on Adaptive Lateral Inhibition Mechanism for Objective Recognition.
Comput Intell Neurosci. 2022 Oct 13;2022:4242235. doi: 10.1155/2022/4242235. eCollection 2022.
7
Rethinking the performance comparison between SNNS and ANNS.
Neural Netw. 2020 Jan;121:294-307. doi: 10.1016/j.neunet.2019.09.005. Epub 2019 Sep 19.
8
Backpropagation-Based Learning Techniques for Deep Spiking Neural Networks: A Survey.
IEEE Trans Neural Netw Learn Syst. 2024 Sep;35(9):11906-11921. doi: 10.1109/TNNLS.2023.3263008. Epub 2024 Sep 3.
9
On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices.
Front Neurosci. 2020 Jul 7;14:423. doi: 10.3389/fnins.2020.00423. eCollection 2020.
10
Analyzing and Accelerating the Bottlenecks of Training Deep SNNs With Backpropagation.
Neural Comput. 2020 Dec;32(12):2557-2600. doi: 10.1162/neco_a_01319. Epub 2020 Sep 18.

Cited By

1
BN-SNN: Spiking neural networks with bistable neurons for object detection.
PLoS One. 2025 Jul 10;20(7):e0327513. doi: 10.1371/journal.pone.0327513. eCollection 2025.

References

1
Brain-inspired global-local learning incorporated with neuromorphic computing.
Nat Commun. 2022 Jan 10;13(1):65. doi: 10.1038/s41467-021-27653-2.
2
Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.
Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.
3
Rethinking the performance comparison between SNNS and ANNS.
Neural Netw. 2020 Jan;121:294-307. doi: 10.1016/j.neunet.2019.09.005. Epub 2019 Sep 19.
4
Going Deeper in Spiking Neural Networks: VGG and Residual Architectures.
Front Neurosci. 2019 Mar 7;13:95. doi: 10.3389/fnins.2019.00095. eCollection 2019.
5
Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning.
Front Neurosci. 2018 Aug 3;12:435. doi: 10.3389/fnins.2018.00435. eCollection 2018.
6
Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
7
Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification.
Front Neurosci. 2017 Dec 7;11:682. doi: 10.3389/fnins.2017.00682. eCollection 2017.
8
Training Deep Spiking Neural Networks Using Backpropagation.
Front Neurosci. 2016 Nov 8;10:508. doi: 10.3389/fnins.2016.00508. eCollection 2016.
9
Convolutional networks for fast, energy-efficient neuromorphic computing.
Proc Natl Acad Sci U S A. 2016 Oct 11;113(41):11441-11446. doi: 10.1073/pnas.1604850113. Epub 2016 Sep 20.
10
Leaky Integrate-and-Fire Neuron Circuit Based on Floating-Gate Integrator.
Front Neurosci. 2016 May 23;10:212. doi: 10.3389/fnins.2016.00212. eCollection 2016.