A Supervised Learning Algorithm for Multilayer Spiking Neural Networks Based on Temporal Coding Toward Energy-Efficient VLSI Processor Design.

Authors

Sakemi Yusuke, Morino Kai, Morie Takashi, Aihara Kazuyuki

Publication

IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):394-408. doi: 10.1109/TNNLS.2021.3095068. Epub 2023 Jan 5.

DOI: 10.1109/TNNLS.2021.3095068
PMID: 34280109
Abstract

Spiking neural networks (SNNs) are brain-inspired mathematical models with the ability to process information in the form of spikes. SNNs are expected to provide not only new machine-learning algorithms but also energy-efficient computational models when implemented in very-large-scale integration (VLSI) circuits. In this article, we propose a novel supervised learning algorithm for SNNs based on temporal coding. A spiking neuron in this algorithm is designed to facilitate analog VLSI implementations with analog resistive memory, by which ultrahigh energy efficiency can be achieved. We also propose several techniques to improve the performance on recognition tasks and show that the classification accuracy of the proposed algorithm is as high as that of the state-of-the-art temporal coding SNN algorithms on the MNIST and Fashion-MNIST datasets. Finally, we discuss the robustness of the proposed SNNs against variations that arise from the device manufacturing process and are unavoidable in analog VLSI implementation. We also propose a technique to suppress the effects of variations in the manufacturing process on the recognition performance.
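The abstract does not spell out the neuron model or learning rule; as a rough illustration only, the sketch below shows the general idea behind time-to-first-spike (TTFS) temporal coding: stronger inputs fire earlier, and a non-leaky neuron whose potential ramps linearly after each input spike fires when the accumulated ramp crosses a threshold. The names `ttfs_encode` and `spike_time`, the ramp dynamics, and the threshold `theta` are assumptions for illustration, not the paper's actual formulation.

```python
def ttfs_encode(x, t_max=1.0):
    """Time-to-first-spike encoding: a larger input fires earlier.

    x is assumed normalized to [0, 1]; spike time = t_max * (1 - x),
    so x = 1 spikes at t = 0 and x = 0 spikes at t = t_max.
    """
    return t_max * (1.0 - x)


def spike_time(in_times, weights, theta=1.0):
    """Output spike time of a non-leaky ramp neuron (illustrative model).

    After each input spike at t_i with weight w_i, the membrane potential
    grows as sum_i w_i * (t - t_i) over spikes received so far; the neuron
    fires at the first t where this sum reaches theta.
    """
    pairs = sorted(zip(in_times, weights))  # process inputs in time order
    w_sum = wt_sum = 0.0
    for k, (t_i, w_i) in enumerate(pairs):
        w_sum += w_i
        wt_sum += w_i * t_i
        if w_sum > 0:
            # Candidate firing time given the spikes received so far:
            # solve sum w_i * (t - t_i) = theta for t.
            t_cand = (theta + wt_sum) / w_sum
            t_next = pairs[k + 1][0] if k + 1 < len(pairs) else float("inf")
            if t_cand <= t_next:  # fires before the next input arrives
                return t_cand
    return float("inf")  # threshold never reached: the neuron stays silent
```

In a TTFS classifier of this style, each output neuron races to fire and the predicted class is simply the one that spikes first, which is what makes the coding attractive for low-energy hardware: computation can stop at the first output spike.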


Similar Articles

1. A Supervised Learning Algorithm for Multilayer Spiking Neural Networks Based on Temporal Coding Toward Energy-Efficient VLSI Processor Design.
   IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):394-408. doi: 10.1109/TNNLS.2021.3095068. Epub 2023 Jan 5.
2. A TTFS-based energy and utilization efficient neuromorphic CNN accelerator.
   Front Neurosci. 2023 May 5;17:1121592. doi: 10.3389/fnins.2023.1121592. eCollection 2023.
3. Supervised Learning in Multilayer Spiking Neural Networks With Spike Temporal Error Backpropagation.
   IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10141-10153. doi: 10.1109/TNNLS.2022.3164930. Epub 2023 Nov 30.
4. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
   Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
5. Supervised Learning in All FeFET-Based Spiking Neural Network: Opportunities and Challenges.
   Front Neurosci. 2020 Jun 24;14:634. doi: 10.3389/fnins.2020.00634. eCollection 2020.
6. Competitive Learning in a Spiking Neural Network: Towards an Intelligent Pattern Classifier.
   Sensors (Basel). 2020 Jan 16;20(2):500. doi: 10.3390/s20020500.
7. Trainable Reference Spikes Improve Temporal Information Processing of SNNs With Supervised Learning.
   Neural Comput. 2024 Sep 17;36(10):2136-2169. doi: 10.1162/neco_a_01702.
8. Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
   Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
9. HybridSNN: Combining Bio-Machine Strengths by Boosting Adaptive Spiking Neural Networks.
   IEEE Trans Neural Netw Learn Syst. 2023 Sep;34(9):5841-5855. doi: 10.1109/TNNLS.2021.3131356. Epub 2023 Sep 1.
10. SpQuant-SNN: ultra-low precision membrane potential with sparse activations unlock the potential of on-device spiking neural networks applications.
    Front Neurosci. 2024 Sep 4;18:1440000. doi: 10.3389/fnins.2024.1440000. eCollection 2024.

Cited By

1. Rethinking skip connections in Spiking Neural Networks with Time-To-First-Spike coding.
   Front Neurosci. 2024 Feb 14;18:1346805. doi: 10.3389/fnins.2024.1346805. eCollection 2024.
2. Monitoring time domain characteristics of Parkinson's disease using 3D memristive neuromorphic system.
   Front Comput Neurosci. 2023 Dec 15;17:1274575. doi: 10.3389/fncom.2023.1274575. eCollection 2023.
3. Sparse-firing regularization methods for spiking neural networks with time-to-first-spike coding.
   Sci Rep. 2023 Dec 21;13(1):22897. doi: 10.1038/s41598-023-50201-5.
4. Enhanced representation learning with temporal coding in sparsely spiking neural networks.
   Front Comput Neurosci. 2023 Nov 21;17:1250908. doi: 10.3389/fncom.2023.1250908. eCollection 2023.
5. First-spike coding promotes accurate and efficient spiking neural networks for discrete events with rich temporal structures.
   Front Neurosci. 2023 Oct 2;17:1266003. doi: 10.3389/fnins.2023.1266003. eCollection 2023.
6. Adaptive STDP-based on-chip spike pattern detection.
   Front Neurosci. 2023 Jul 13;17:1203956. doi: 10.3389/fnins.2023.1203956. eCollection 2023.
7. Analyzing time-to-first-spike coding schemes: A theoretical approach.
   Front Neurosci. 2022 Sep 26;16:971937. doi: 10.3389/fnins.2022.971937. eCollection 2022.
8. A Synaptic Pruning-Based Spiking Neural Network for Hand-Written Digits Classification.
   Front Artif Intell. 2022 Feb 24;5:680165. doi: 10.3389/frai.2022.680165. eCollection 2022.
9. Spiking Autoencoders With Temporal Coding.
   Front Neurosci. 2021 Aug 13;15:712667. doi: 10.3389/fnins.2021.712667. eCollection 2021.