

Impact of the Sub-Resting Membrane Potential on Accurate Inference in Spiking Neural Networks.

Affiliations

Inter-university Semiconductor Research Center (ISRC) and Department of Electrical and Computer Engineering, Seoul National University, Seoul, 08826, Republic of Korea.

Publication Information

Sci Rep. 2020 Feb 26;10(1):3515. doi: 10.1038/s41598-020-60572-8.

DOI: 10.1038/s41598-020-60572-8
PMID: 32103126
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7044207/
Abstract

Spiking neural networks (SNNs) are considered the third generation of artificial neural networks and have the potential to improve the energy efficiency of conventional computing systems. Although the firing rate of a spiking neuron approximates the rectified linear unit (ReLU) activation of an analog-valued neural network (ANN), many challenges remain owing to operational differences between ANNs and SNNs. Unlike actual biological and biophysical processes, various hardware implementations of neurons and SNNs do not allow the membrane potential to fall below the resting potential; in other words, they forbid the sub-resting membrane potential. Because both excitatory post-synaptic potentials (EPSPs) and inhibitory post-synaptic potentials (IPSPs) occur, negatively valued synaptic weights in SNNs drive the membrane below the resting potential at some time points. If a membrane is not allowed to hold a sub-resting potential, errors accumulate over time, resulting in inaccurate inference. This phenomenon is not observed in ANNs, which use only spatial synaptic integration, but it can cause serious performance degradation in SNNs. In this paper, we demonstrate the impact of the sub-resting membrane potential on accurate inference in SNNs and discuss several important considerations for a hardware SNN that can maintain the sub-resting membrane potential. All of the results in this paper indicate that neurons must allow the sub-resting membrane potential in order to realize high-performance SNNs.
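The mechanism the abstract describes can be illustrated with a toy simulation (this sketch is not from the paper; the neuron model, input rates, and weights are illustrative assumptions). An integrate-and-fire neuron with reset-by-subtraction is driven by two Bernoulli spike trains through one excitatory and one negative (inhibitory) weight, once with the sub-resting membrane potential allowed and once with the membrane clamped at the resting potential:

```python
import numpy as np

def if_neuron_rate(p_in, weights, steps=2000, threshold=1.0,
                   allow_sub_resting=True, seed=0):
    """Firing rate of an integrate-and-fire neuron driven by Bernoulli spike trains.

    p_in: per-step spike probability of each input channel
    weights: synaptic weights; negative entries model IPSPs
    allow_sub_resting: if False, the membrane is clamped at the resting
        potential (0), as in hardware that forbids sub-resting values
    """
    rng = np.random.default_rng(seed)
    v = 0.0                                     # membrane potential (resting = 0)
    spikes = 0
    for _ in range(steps):
        pre = rng.random(len(p_in)) < p_in      # input spikes this step
        v += float(np.dot(weights, pre))        # synaptic integration
        if not allow_sub_resting:
            v = max(v, 0.0)                     # discard sub-resting excursions
        if v >= threshold:
            spikes += 1
            v -= threshold                      # reset by subtraction
    return spikes / steps

p = np.array([0.6, 0.5])                        # input firing probabilities
w = np.array([1.0, -0.8])                       # one EPSP, one IPSP synapse
relu_rate = max(0.0, float(np.dot(w, p)))       # ANN/ReLU estimate: 0.6*1.0 - 0.5*0.8 = 0.2

rate_free = if_neuron_rate(p, w, allow_sub_resting=True)
rate_clamped = if_neuron_rate(p, w, allow_sub_resting=False)
print(relu_rate, rate_free, rate_clamped)
```

With the sub-resting potential allowed, the measured firing rate tracks the ReLU estimate; with the membrane clamped at 0, every inhibitory contribution arriving near the resting potential is silently discarded, so the clamped rate overshoots — the per-step error the abstract says accumulates over time.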


Figures

Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d966/7044207/70d9f7a902c5/41598_2020_60572_Fig1_HTML.jpg
Fig. 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d966/7044207/1439056273e9/41598_2020_60572_Fig2_HTML.jpg
Fig. 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d966/7044207/2b73f464ec4d/41598_2020_60572_Fig3_HTML.jpg
Fig. 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d966/7044207/01bc8a9dab7b/41598_2020_60572_Fig4_HTML.jpg

Similar Articles

1
Impact of the Sub-Resting Membrane Potential on Accurate Inference in Spiking Neural Networks.
Sci Rep. 2020 Feb 26;10(1):3515. doi: 10.1038/s41598-020-60572-8.
2
Low-Latency Spiking Neural Networks Using Pre-Charged Membrane Potential and Delayed Evaluation.
Front Neurosci. 2021 Feb 18;15:629000. doi: 10.3389/fnins.2021.629000. eCollection 2021.
3
A universal ANN-to-SNN framework for achieving high accuracy and low latency deep Spiking Neural Networks.
Neural Netw. 2024 Jun;174:106244. doi: 10.1016/j.neunet.2024.106244. Epub 2024 Mar 15.
4
Quantization Framework for Fast Spiking Neural Networks.
Front Neurosci. 2022 Jul 19;16:918793. doi: 10.3389/fnins.2022.918793. eCollection 2022.
5
Rethinking the performance comparison between SNNs and ANNs.
Neural Netw. 2020 Jan;121:294-307. doi: 10.1016/j.neunet.2019.09.005. Epub 2019 Sep 19.
6
High-accuracy deep ANN-to-SNN conversion using quantization-aware training framework and calcium-gated bipolar leaky integrate and fire neuron.
Front Neurosci. 2023 Mar 8;17:1141701. doi: 10.3389/fnins.2023.1141701. eCollection 2023.
7
A TTFS-based energy and utilization efficient neuromorphic CNN accelerator.
Front Neurosci. 2023 May 5;17:1121592. doi: 10.3389/fnins.2023.1121592. eCollection 2023.
8
Attention Spiking Neural Networks.
IEEE Trans Pattern Anal Mach Intell. 2023 Aug;45(8):9393-9410. doi: 10.1109/TPAMI.2023.3241201. Epub 2023 Jun 30.
9
Sparse Computation in Adaptive Spiking Neural Networks.
Front Neurosci. 2019 Jan 8;12:987. doi: 10.3389/fnins.2018.00987. eCollection 2018.
10
Progressive Tandem Learning for Pattern Recognition With Deep Spiking Neural Networks.
IEEE Trans Pattern Anal Mach Intell. 2022 Nov;44(11):7824-7840. doi: 10.1109/TPAMI.2021.3114196. Epub 2022 Oct 4.

Cited By

1
Memcapacitor Crossbar Array with Charge Trap NAND Flash Structure for Neuromorphic Computing.
Adv Sci (Weinh). 2023 Nov;10(32):e2303817. doi: 10.1002/advs.202303817. Epub 2023 Sep 26.
2
Toward robust and scalable deep spiking reinforcement learning.
Front Neurorobot. 2023 Jan 20;16:1075647. doi: 10.3389/fnbot.2022.1075647. eCollection 2022.
3
Low-Latency Spiking Neural Networks Using Pre-Charged Membrane Potential and Delayed Evaluation.
Front Neurosci. 2021 Feb 18;15:629000. doi: 10.3389/fnins.2021.629000. eCollection 2021.

References

1
Training Deep Spiking Neural Networks Using Backpropagation.
Front Neurosci. 2016 Nov 8;10:508. doi: 10.3389/fnins.2016.00508. eCollection 2016.