
Constraints on Hebbian and STDP learned weights of a spiking neuron.

Affiliations

CEMS, School of Computing, University of Kent, CT2 7NF, Canterbury, UK.

Publication information

Neural Netw. 2021 Mar;135:192-200. doi: 10.1016/j.neunet.2020.12.012. Epub 2021 Jan 2.

DOI: 10.1016/j.neunet.2020.12.012
PMID: 33401225
Abstract

We analyse mathematically the constraints on weights resulting from Hebbian and STDP learning rules applied to a spiking neuron with weight normalisation. In the case of pure Hebbian learning, we find that the normalised weights equal the promotion probabilities of weights up to correction terms that depend on the learning rate and are usually small. A similar relation can be derived for STDP algorithms, where the normalised weight values reflect a difference between the promotion and demotion probabilities of the weight. These relations are practically useful in that they allow checking for convergence of Hebbian and STDP algorithms. Another application is novelty detection. We demonstrate this using the MNIST dataset.
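The abstract's central relation, that normalised weights track the promotion probabilities of the weights up to small learning-rate-dependent corrections, can be illustrated with a toy simulation. Note the update rule, rates, and parameters below are assumptions chosen for illustration, not the paper's exact model: each input is promoted with a fixed probability per step, and the weights are multiplicatively renormalised.

```python
import numpy as np

# Hedged sketch of a Hebbian rule with weight normalisation: input i is
# "promoted" (co-active with the neuron) with probability p[i] each step;
# promoted weights grow by eta, then all weights are renormalised to sum
# to 1. The specific probabilities and eta are illustrative assumptions.
rng = np.random.default_rng(0)

p = np.array([0.1, 0.2, 0.3, 0.4])   # per-input promotion probabilities
w = np.full(4, 0.25)                 # normalised weights, start uniform
eta = 0.002                          # learning rate

w_avg = np.zeros(4)
for t in range(120_000):
    x = rng.random(4) < p            # which inputs are promoted this step
    w = w + eta * x                  # Hebbian promotion
    w = w / w.sum()                  # multiplicative normalisation
    if t >= 100_000:                 # time-average after convergence
        w_avg += w
w_avg /= 20_000

# Per the abstract's relation, the normalised weights settle near the
# normalised promotion probabilities, up to O(eta) correction terms.
print(w_avg)                         # close to p / p.sum() = [0.1, 0.2, 0.3, 0.4]
```

Checking how far the learned weights sit from the promotion probabilities is exactly the kind of convergence test the abstract describes; shrinking `eta` shrinks both the correction terms and the fluctuations.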


Similar articles

1
Constraints on Hebbian and STDP learned weights of a spiking neuron.
Neural Netw. 2021 Mar;135:192-200. doi: 10.1016/j.neunet.2020.12.012. Epub 2021 Jan 2.
2
Spike timing-dependent plasticity induces non-trivial topology in the brain.
Neural Netw. 2017 Apr;88:58-64. doi: 10.1016/j.neunet.2017.01.010. Epub 2017 Jan 31.
3
What can a neuron learn with spike-timing-dependent plasticity?
Neural Comput. 2005 Nov;17(11):2337-82. doi: 10.1162/0899766054796888.
4
Neuromodulated Spike-Timing-Dependent Plasticity, and Theory of Three-Factor Learning Rules.
Front Neural Circuits. 2016 Jan 19;9:85. doi: 10.3389/fncir.2015.00085. eCollection 2015.
5
Competitive STDP Learning of Overlapping Spatial Patterns.
Neural Comput. 2015 Aug;27(8):1673-85. doi: 10.1162/NECO_a_00753. Epub 2015 Jun 16.
6
Representation of input structure in synaptic weights by spike-timing-dependent plasticity.
Phys Rev E Stat Nonlin Soft Matter Phys. 2010 Aug;82(2 Pt 1):021912. doi: 10.1103/PhysRevE.82.021912. Epub 2010 Aug 13.
7
Laguerre-Volterra identification of spike-timing-dependent plasticity from spiking activity: a simulation study.
Annu Int Conf IEEE Eng Med Biol Soc. 2013;2013:5578-81. doi: 10.1109/EMBC.2013.6610814.
8
STDP-based spiking deep convolutional neural networks for object recognition.
Neural Netw. 2018 Mar;99:56-67. doi: 10.1016/j.neunet.2017.12.005. Epub 2017 Dec 23.
9
Reconciling the STDP and BCM models of synaptic plasticity in a spiking recurrent neural network.
Neural Comput. 2010 Aug;22(8):2059-85. doi: 10.1162/NECO_a_00003-Bush.
10
Spectral analysis of input spike trains by spike-timing-dependent plasticity.
PLoS Comput Biol. 2012;8(7):e1002584. doi: 10.1371/journal.pcbi.1002584. Epub 2012 Jul 5.

Cited by

1
Rewiring the disordered connectome with circuit-based paired stimulation after stroke-a randomized, double-blind and controlled Phase II trial.
Brain Commun. 2024 Dec 4;6(6):fcae437. doi: 10.1093/braincomms/fcae437. eCollection 2024.
2
Inhibitory neurons control the consolidation of neural assemblies via adaptation to selective stimuli.
Sci Rep. 2023 Apr 28;13(1):6949. doi: 10.1038/s41598-023-34165-0.
3
Applying the Properties of Neurons in Machine Learning: A Brain-like Neural Model with Interactive Stimulation for Data Classification.
Brain Sci. 2022 Sep 3;12(9):1191. doi: 10.3390/brainsci12091191.