Rethinking the Role of Normalization and Residual Blocks for Spiking Neural Networks.

Author Affiliations

Tokyo Research Center, Aisin Corporation, Akihabara Daibiru 7F 1-18-13, Sotokanda, Chiyoda-ku, Tokyo 101-0021, Japan.

AISIN SOFTWARE Co., Ltd., Advance Square Kariya 7F 1-1-1, Aioicho, Kariya 448-0027, Aichi, Japan.

Publication Information

Sensors (Basel). 2022 Apr 8;22(8):2876. doi: 10.3390/s22082876.

DOI: 10.3390/s22082876
PMID: 35458860
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9028401/
Abstract

Biologically inspired spiking neural networks (SNNs) are widely used to realize ultralow-power energy consumption. However, deep SNNs are not easy to train due to the excessive firing of spiking neurons in the hidden layers. To tackle this problem, we propose a novel but simple normalization technique called postsynaptic potential normalization. This normalization removes the subtraction term from the standard normalization and uses the second raw moment instead of the variance as the division term. The spike firing can be controlled, enabling the training to proceed appropriately, by conducting this simple normalization to the postsynaptic potential. The experimental results show that SNNs with our normalization outperformed other models using other normalizations. Furthermore, through the pre-activation residual blocks, the proposed model can train with more than 100 layers without other special techniques dedicated to SNNs.
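
The abstract specifies the normalization concretely enough to illustrate it: drop the mean-subtraction term of standard batch normalization and divide by the square root of the second raw moment E[x^2] (rather than the variance), applied to the postsynaptic potential. Below is a minimal PyTorch-style sketch of that idea; the class name PSPNorm, the learnable scale, the epsilon value, and the tensor layout are illustrative assumptions, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class PSPNorm(nn.Module):
    """Sketch of postsynaptic potential normalization as described in the
    abstract: unlike batch normalization, (x - mean) / sqrt(var + eps),
    there is no subtraction term and the division term is the second raw
    moment E[x^2] instead of the variance. The learnable scale, eps, and
    tensor layout are assumptions made for illustration."""

    def __init__(self, num_features: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # Optional per-channel scale, analogous to BatchNorm's gamma (assumption).
        self.gamma = nn.Parameter(torch.ones(num_features))

    def forward(self, psp: torch.Tensor) -> torch.Tensor:
        # psp: postsynaptic potential with shape (batch, channels, ...).
        reduce_dims = [d for d in range(psp.dim()) if d != 1]
        second_raw_moment = psp.pow(2).mean(dim=reduce_dims, keepdim=True)  # E[x^2]
        normalized = psp / torch.sqrt(second_raw_moment + self.eps)
        scale_shape = [1, -1] + [1] * (psp.dim() - 2)
        return self.gamma.view(scale_shape) * normalized


# Example: normalize a batch of postsynaptic potentials before the spiking
# nonlinearity (placement follows the abstract's description; shapes are
# made up for the example).
if __name__ == "__main__":
    norm = PSPNorm(num_features=64)
    psp = torch.randn(8, 64, 32, 32)   # (batch, channels, H, W)
    out = norm(psp)
    print(out.shape)                   # torch.Size([8, 64, 32, 32])
```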

Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/b8d418637e4f/sensors-22-02876-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/ad4e64ff38f8/sensors-22-02876-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/5f7a01839796/sensors-22-02876-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/094a3f7effaa/sensors-22-02876-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/a7b33b256410/sensors-22-02876-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/98fc1ce2328a/sensors-22-02876-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/c10ffebad143/sensors-22-02876-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/ba26d643eedb/sensors-22-02876-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/57d2ea2fe7f2/sensors-22-02876-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/46eb6202071a/sensors-22-02876-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/5a033468a574/sensors-22-02876-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1f6/9028401/e0a31597d9a1/sensors-22-02876-g012.jpg

Similar Articles

1. Rethinking the Role of Normalization and Residual Blocks for Spiking Neural Networks.
   Sensors (Basel). 2022 Apr 8;22(8):2876. doi: 10.3390/s22082876.
2. Short-Term Memory Impairment
3. The architecture design and training optimization of spiking neural network with low-latency and high-performance for classification and segmentation.
   Neural Netw. 2025 Jun 21;191:107790. doi: 10.1016/j.neunet.2025.107790.
4. Automated devices for identifying peripheral arterial disease in people with leg ulceration: an evidence synthesis and cost-effectiveness analysis.
   Health Technol Assess. 2024 Aug;28(37):1-158. doi: 10.3310/TWCG3912.
5. Gonadotropin-releasing hormone (GnRH) analogues for premenstrual syndrome (PMS).
   Cochrane Database Syst Rev. 2025 Jun 10;6(6):CD011330. doi: 10.1002/14651858.CD011330.pub2.
6. Management of urinary stones by experts in stone disease (ESD 2025).
   Arch Ital Urol Androl. 2025 Jun 30;97(2):14085. doi: 10.4081/aiua.2025.14085.
7. Sexual Harassment and Prevention Training
8. Sympathetic nerve blocks for persistent pain in adults with inoperable abdominopelvic cancer.
   Cochrane Database Syst Rev. 2024 Jun 6;6(6):CD015229. doi: 10.1002/14651858.CD015229.pub2.
9. The Lived Experience of Autistic Adults in Employment: A Systematic Search and Synthesis.
   Autism Adulthood. 2024 Dec 2;6(4):495-509. doi: 10.1089/aut.2022.0114. eCollection 2024 Dec.
10. Idiopathic (Genetic) Generalized Epilepsy

Cited By

1. Rethinking skip connections in Spiking Neural Networks with Time-To-First-Spike coding.
   Front Neurosci. 2024 Feb 14;18:1346805. doi: 10.3389/fnins.2024.1346805. eCollection 2024.
2. Direct learning-based deep spiking neural networks: a review.
   Front Neurosci. 2023 Jun 16;17:1209795. doi: 10.3389/fnins.2023.1209795. eCollection 2023.

References Cited in This Article

1. Revisiting Batch Normalization for Training Low-Latency Deep Spiking Neural Networks From Scratch.
   Front Neurosci. 2021 Dec 9;15:773954. doi: 10.3389/fnins.2021.773954. eCollection 2021.
2. A Correspondence Between Normalization Strategies in Artificial and Biological Neural Networks.
   Neural Comput. 2021 Nov 12;33(12):3179-3203. doi: 10.1162/neco_a_01439.
3. Enabling Spike-Based Backpropagation for Training Deep Neural Network Architectures.
   Front Neurosci. 2020 Feb 28;14:119. doi: 10.3389/fnins.2020.00119. eCollection 2020.
4. Going Deeper in Spiking Neural Networks: VGG and Residual Architectures.
   Front Neurosci. 2019 Mar 7;13:95. doi: 10.3389/fnins.2019.00095. eCollection 2019.
5. Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
   Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
6. SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks.
   Neural Comput. 2018 Jun;30(6):1514-1541. doi: 10.1162/neco_a_01086. Epub 2018 Apr 13.
7. Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification.
   Front Neurosci. 2017 Dec 7;11:682. doi: 10.3389/fnins.2017.00682. eCollection 2017.
8. Unsupervised learning of digit recognition using spike-timing-dependent plasticity.
   Front Comput Neurosci. 2015 Aug 3;9:99. doi: 10.3389/fncom.2015.00099. eCollection 2015.
9. Converting Static Image Datasets to Spiking Neuromorphic Datasets Using Saccades.
   Front Neurosci. 2015 Nov 16;9:437. doi: 10.3389/fnins.2015.00437. eCollection 2015.
10. Simple model of spiking neurons.
   IEEE Trans Neural Netw. 2003;14(6):1569-72. doi: 10.1109/TNN.2003.820440.