ReStoCNet: Residual Stochastic Binary Convolutional Spiking Neural Network for Memory-Efficient Neuromorphic Computing.

Authors

Srinivasan Gopalakrishnan, Roy Kaushik

Affiliation

Department of ECE, Purdue University, West Lafayette, IN, United States.

Publication

Front Neurosci. 2019 Mar 19;13:189. doi: 10.3389/fnins.2019.00189. eCollection 2019.

DOI: 10.3389/fnins.2019.00189
PMID: 30941003
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6434391/
Abstract

In this work, we propose ReStoCNet, a residual stochastic multilayer convolutional Spiking Neural Network (SNN) composed of binary kernels, to reduce the synaptic memory footprint and enhance the computational efficiency of SNNs for complex pattern recognition tasks. ReStoCNet consists of an input layer followed by stacked convolutional layers for hierarchical input feature extraction, pooling layers for dimensionality reduction, and fully-connected layer for inference. In addition, we introduce residual connections between the stacked convolutional layers to improve the hierarchical feature learning capability of deep SNNs. We propose Spike Timing Dependent Plasticity (STDP) based probabilistic learning algorithm, referred to as Hybrid-STDP (HB-STDP), incorporating Hebbian and anti-Hebbian learning mechanisms, to train the binary kernels forming ReStoCNet in a layer-wise unsupervised manner. We demonstrate the efficacy of ReStoCNet and the presented HB-STDP based unsupervised training methodology on the MNIST and CIFAR-10 datasets. We show that residual connections enable the deeper convolutional layers to self-learn useful high-level input features and mitigate the accuracy loss observed in deep SNNs devoid of residual connections. The proposed ReStoCNet offers >20 × kernel memory compression compared to full-precision (32-bit) SNN while yielding high enough classification accuracy on the chosen pattern recognition tasks.
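To make the abstract's two key mechanisms concrete, the sketch below illustrates, in plain NumPy, a probabilistic Hebbian/anti-Hebbian update on a binary {-1, +1} kernel and a residual (skip) spike input added to a convolutional layer's membrane potential. It is a minimal illustration assuming simple integrate-and-fire dynamics, arbitrary switching probabilities, and made-up shapes; it is not the authors' HB-STDP implementation. Representing each kernel weight with a single bit rather than a 32-bit float gives roughly a 32x per-weight reduction, which is consistent with the >20x kernel memory compression the abstract reports.

# Minimal, illustrative sketch (not the authors' released code) of the two ideas in the
# abstract: (1) binary {-1, +1} convolution kernels updated probabilistically with a
# hybrid Hebbian/anti-Hebbian STDP-like rule, and (2) a residual spike input added to a
# layer's membrane potential. Names, shapes, and probabilities are assumptions.

import numpy as np

rng = np.random.default_rng(0)

def hb_stdp_update(kernel, pre_spikes, post_spike, p_pot=0.05, p_dep=0.05):
    """Probabilistic hybrid update for one binary kernel (values in {-1, +1}).

    pre_spikes : binary array, same shape as kernel (input spikes in the plasticity window).
    post_spike : 1 if the post-synaptic convolutional neuron fired, else 0.
    If the post-neuron fired, synapses with active inputs are probabilistically switched
    toward +1 (Hebbian); synapses with silent inputs toward -1 (anti-Hebbian).
    """
    if not post_spike:
        return kernel
    flip_up = (pre_spikes == 1) & (rng.random(kernel.shape) < p_pot)
    flip_dn = (pre_spikes == 0) & (rng.random(kernel.shape) < p_dep)
    new_kernel = kernel.copy()
    new_kernel[flip_up] = +1   # potentiate correlated synapses
    new_kernel[flip_dn] = -1   # depress uncorrelated synapses
    return new_kernel

def conv2d_valid(x, k):
    """Naive 'valid' 2-D cross-correlation of spike map x with kernel k."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i+kh, j:j+kw] * k)
    return out

# Toy demo: one 3x3 binary kernel, integrate-and-fire neurons over a few time steps,
# with an assumed residual spike map added directly to the membrane potential.
kernel = rng.choice([-1, 1], size=(3, 3))
threshold = 2.0
v = np.zeros((6, 6))                                          # membrane potentials
for t in range(5):
    spikes_in   = (rng.random((8, 8)) < 0.3).astype(float)    # Poisson-like input spikes
    residual_in = (rng.random((6, 6)) < 0.1).astype(float)    # skip-connection spikes (assumed)
    v += conv2d_valid(spikes_in, kernel) + residual_in        # integrate conv + residual input
    post = v > threshold                                      # fire where potential crosses threshold
    v[post] = 0.0                                             # reset fired neurons
    if post.any():                                            # update kernel from one firing neuron's receptive field
        i, j = np.argwhere(post)[0]
        kernel = hb_stdp_update(kernel, spikes_in[i:i+3, j:j+3], post_spike=1)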


Figures (fnins-13-00189, g0001–g0011):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/ad00ea9bb69e/fnins-13-00189-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/c31863d7d781/fnins-13-00189-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/c2e3091c1055/fnins-13-00189-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/29c279263e27/fnins-13-00189-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/64f8a6b259eb/fnins-13-00189-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/a6b299bda436/fnins-13-00189-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/e199d104f49b/fnins-13-00189-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/598af4fa2b69/fnins-13-00189-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/ee7cf9b23534/fnins-13-00189-g0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/7e39d4923388/fnins-13-00189-g0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c3c5/6434391/3798e85eb27d/fnins-13-00189-g0011.jpg

Similar Articles

1. ReStoCNet: Residual Stochastic Binary Convolutional Spiking Neural Network for Memory-Efficient Neuromorphic Computing.
Front Neurosci. 2019 Mar 19;13:189. doi: 10.3389/fnins.2019.00189. eCollection 2019.
2. Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning.
Front Neurosci. 2018 Aug 3;12:435. doi: 10.3389/fnins.2018.00435. eCollection 2018.
3. SSTDP: Supervised Spike Timing Dependent Plasticity for Efficient Spiking Neural Network Training.
Front Neurosci. 2021 Nov 4;15:756876. doi: 10.3389/fnins.2021.756876. eCollection 2021.
4. SpiLinC: Spiking Liquid-Ensemble Computing for Unsupervised Speech and Image Recognition.
Front Neurosci. 2018 Aug 23;12:524. doi: 10.3389/fnins.2018.00524. eCollection 2018.
5. MONETA: A Processing-In-Memory-Based Hardware Platform for the Hybrid Convolutional Spiking Neural Network With Online Learning.
Front Neurosci. 2022 Apr 11;16:775457. doi: 10.3389/fnins.2022.775457. eCollection 2022.
6. An unsupervised STDP-based spiking neural network inspired by biologically plausible learning rules and connections.
Neural Netw. 2023 Aug;165:799-808. doi: 10.1016/j.neunet.2023.06.019. Epub 2023 Jun 22.
7. Paired competing neurons improving STDP supervised local learning in Spiking Neural Networks.
Front Neurosci. 2024 Jul 24;18:1401690. doi: 10.3389/fnins.2024.1401690. eCollection 2024.
8. [A bio-inspired hierarchical spiking neural network with biological synaptic plasticity for event camera object recognition].
Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2023 Aug 25;40(4):692-699. doi: 10.7507/1001-5515.202207040.
9. ALBSNN: ultra-low latency adaptive local binary spiking neural network with accuracy loss estimator.
Front Neurosci. 2023 Sep 13;17:1225871. doi: 10.3389/fnins.2023.1225871. eCollection 2023.
10. A Heterogeneous Spiking Neural Network for Unsupervised Learning of Spatiotemporal Patterns.
Front Neurosci. 2021 Jan 14;14:615756. doi: 10.3389/fnins.2020.615756. eCollection 2020.

Cited By

1. Encrypted Spiking Neural Networks Based on Adaptive Differential Privacy Mechanism.
Entropy (Basel). 2025 Mar 22;27(4):333. doi: 10.3390/e27040333.
2. Paired competing neurons improving STDP supervised local learning in Spiking Neural Networks.
Front Neurosci. 2024 Jul 24;18:1401690. doi: 10.3389/fnins.2024.1401690. eCollection 2024.
3. ALBSNN: ultra-low latency adaptive local binary spiking neural network with accuracy loss estimator.
Front Neurosci. 2023 Sep 13;17:1225871. doi: 10.3389/fnins.2023.1225871. eCollection 2023.
4. Memcapacitor Crossbar Array with Charge Trap NAND Flash Structure for Neuromorphic Computing.
Adv Sci (Weinh). 2023 Nov;10(32):e2303817. doi: 10.1002/advs.202303817. Epub 2023 Sep 26.
5. BlocTrain: Block-Wise Conditional Training and Inference for Efficient Spike-Based Deep Learning.
Front Neurosci. 2021 Oct 29;15:603433. doi: 10.3389/fnins.2021.603433. eCollection 2021.
6. [A review of brain-like spiking neural network and its neuromorphic chip research].
Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2021 Oct 25;38(5):986-994. doi: 10.7507/1001-5515.202011005.
7. A Cost-Efficient High-Speed VLSI Architecture for Spiking Convolutional Neural Network Inference Using Time-Step Binary Spike Maps.
Sensors (Basel). 2021 Sep 8;21(18):6006. doi: 10.3390/s21186006.
8. Stochastic binary synapses having sigmoidal cumulative distribution functions for unsupervised learning with spike timing-dependent plasticity.
Sci Rep. 2021 Sep 14;11(1):18282. doi: 10.1038/s41598-021-97583-y.
9. End-to-End Implementation of Various Hybrid Neural Networks on a Cross-Paradigm Neuromorphic Chip.
Front Neurosci. 2021 Feb 2;15:615279. doi: 10.3389/fnins.2021.615279. eCollection 2021.
10. A Heterogeneous Spiking Neural Network for Unsupervised Learning of Spatiotemporal Patterns.
Front Neurosci. 2021 Jan 14;14:615756. doi: 10.3389/fnins.2020.615756. eCollection 2020.

References

1. Spiking Deep Residual Networks.
IEEE Trans Neural Netw Learn Syst. 2023 Aug;34(8):5200-5205. doi: 10.1109/TNNLS.2021.3119238. Epub 2023 Aug 4.
2. Going Deeper in Spiking Neural Networks: VGG and Residual Architectures.
Front Neurosci. 2019 Mar 7;13:95. doi: 10.3389/fnins.2019.00095. eCollection 2019.
3. Deep Learning With Spiking Neurons: Opportunities and Challenges.
Front Neurosci. 2018 Oct 25;12:774. doi: 10.3389/fnins.2018.00774. eCollection 2018.
4. On Practical Issues for Stochastic STDP Hardware With 1-bit Synaptic Weights.
Front Neurosci. 2018 Oct 15;12:665. doi: 10.3389/fnins.2018.00665. eCollection 2018.
5. Training Deep Spiking Convolutional Neural Networks With STDP-Based Unsupervised Pre-training Followed by Supervised Fine-Tuning.
Front Neurosci. 2018 Aug 3;12:435. doi: 10.3389/fnins.2018.00435. eCollection 2018.
6. Event-Based, Timescale Invariant Unsupervised Online Deep Learning With STDP.
Front Comput Neurosci. 2018 Jun 14;12:46. doi: 10.3389/fncom.2018.00046. eCollection 2018.
7. Spatio-Temporal Backpropagation for Training High-Performance Spiking Neural Networks.
Front Neurosci. 2018 May 23;12:331. doi: 10.3389/fnins.2018.00331. eCollection 2018.
8. Unsupervised Feature Learning With Winner-Takes-All Based STDP.
Front Comput Neurosci. 2018 Apr 5;12:24. doi: 10.3389/fncom.2018.00024. eCollection 2018.
9. STDP-based spiking deep convolutional neural networks for object recognition.
Neural Netw. 2018 Mar;99:56-67. doi: 10.1016/j.neunet.2017.12.005. Epub 2017 Dec 23.
10. Supervised Learning Based on Temporal Coding in Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2018 Jul;29(7):3227-3235. doi: 10.1109/TNNLS.2017.2726060. Epub 2017 Aug 1.