CRBA: A Competitive Rate-Based Algorithm Based on Competitive Spiking Neural Networks.

Author Information

Cachi Paolo G, Ventura Sebastián, Cios Krzysztof J

Affiliations

Department of Computer Science, Virginia Commonwealth University, Richmond, VA, United States.

Department of Computer Science, Universidad de Córdoba, Córdoba, Spain.

Publication Information

Front Comput Neurosci. 2021 Apr 22;15:627567. doi: 10.3389/fncom.2021.627567. eCollection 2021.

DOI: 10.3389/fncom.2021.627567
PMID: 33967726
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8100331/
Abstract

In this paper we present a Competitive Rate-Based Algorithm (CRBA) that approximates the operation of a Competitive Spiking Neural Network (CSNN). CRBA models the competition between neurons during a sample presentation, which can be reduced to ranking the neurons by a dot-product operation together with a discrete Expectation-Maximization algorithm; the latter is equivalent to the spike-timing-dependent plasticity rule. CRBA's performance is compared with that of CSNN on the MNIST and Fashion-MNIST datasets. The results show that CRBA performs on par with CSNN while using three orders of magnitude less computational time. Importantly, we show that the weights and firing thresholds learned by CRBA can be used to initialize CSNN's parameters, which results in much more efficient operation.
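The core step described in the abstract — competition between neurons collapsing to a dot-product ranking, with the winner updated toward the sample — can be sketched in a few lines. This is an illustrative sketch only, not the paper's implementation: the function name, learning rate, and the exact form of the EM-style update are assumptions.

```python
import numpy as np

def crba_step(weights, x, lr=0.05):
    """One competitive update sketching the abstract's idea: neurons are
    ranked by the dot product of their weight vectors with the input, and
    the winning neuron's weights move toward the sample (a discrete
    EM-style update standing in for STDP). Illustrative only."""
    scores = weights @ x                   # dot-product ranking of all neurons
    winner = int(np.argmax(scores))        # competition collapses to an argmax
    weights[winner] += lr * (x - weights[winner])  # pull winner toward sample
    return winner

# Two neurons, two-dimensional input: neuron 0 should win on an input
# close to its weight vector, and its weights should move toward x.
W = np.array([[1.0, 0.0], [0.0, 1.0]])
winner = crba_step(W, np.array([0.9, 0.1]))
```

Because the competition reduces to an argmax over dot products, each training sample costs one matrix-vector product plus one local weight update, which is consistent with the large speedup over simulating spike timing reported in the abstract.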


Figures (1-9):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/27b3fdcaad11/fncom-15-627567-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/1ba74e8312b0/fncom-15-627567-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/2f870c4680e4/fncom-15-627567-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/67c7d1380c5e/fncom-15-627567-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/bc7267744459/fncom-15-627567-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/98bf32b84f8a/fncom-15-627567-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/1d01ab569679/fncom-15-627567-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/34e2337934bf/fncom-15-627567-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b34d/8100331/152e4e3d8b49/fncom-15-627567-g0009.jpg

Similar Articles

1
CRBA: A Competitive Rate-Based Algorithm Based on Competitive Spiking Neural Networks.
Front Comput Neurosci. 2021 Apr 22;15:627567. doi: 10.3389/fncom.2021.627567. eCollection 2021.
2
Hierarchical Bayesian Inference and Learning in Spiking Neural Networks.
IEEE Trans Cybern. 2019 Jan;49(1):133-145. doi: 10.1109/TCYB.2017.2768554. Epub 2017 Nov 9.
3
Recurrent Spiking Neural Network Learning Based on a Competitive Maximization of Neuronal Activity.
Front Neuroinform. 2018 Nov 15;12:79. doi: 10.3389/fninf.2018.00079. eCollection 2018.
4
Convolutional spiking neural networks for intent detection based on anticipatory brain potentials using electroencephalogram.
Sci Rep. 2024 Apr 17;14(1):8850. doi: 10.1038/s41598-024-59469-7.
5
Biologically plausible deep learning - But how far can we go with shallow networks?
Neural Netw. 2019 Oct;118:90-101. doi: 10.1016/j.neunet.2019.06.001. Epub 2019 Jun 20.
6
Event-based backpropagation can compute exact gradients for spiking neural networks.
Sci Rep. 2021 Jun 18;11(1):12829. doi: 10.1038/s41598-021-91786-z.
7
A biologically plausible supervised learning method for spiking neural networks using the symmetric STDP rule.
Neural Netw. 2020 Jan;121:387-395. doi: 10.1016/j.neunet.2019.09.007. Epub 2019 Sep 27.
8
Neural Coding in Spiking Neural Networks: A Comparative Study for Robust Neuromorphic Systems.
Front Neurosci. 2021 Mar 4;15:638474. doi: 10.3389/fnins.2021.638474. eCollection 2021.
9
Competitive Learning in a Spiking Neural Network: Towards an Intelligent Pattern Classifier.
Sensors (Basel). 2020 Jan 16;20(2):500. doi: 10.3390/s20020500.
10
Training Deep Spiking Neural Networks Using Backpropagation.
Front Neurosci. 2016 Nov 8;10:508. doi: 10.3389/fnins.2016.00508. eCollection 2016.
