Uleru George-Iulian, Hulea Mircea, Barleanu Alexandru
Department of Computer Engineering, Gheorghe Asachi Technical University of Iași, Dimitrie Mangeron 27, 700050 Iași, Romania.
Biomimetics (Basel). 2023 Jan 11;8(1):28. doi: 10.3390/biomimetics8010028.
The main advantages of spiking neural networks (SNNs) are their high biological plausibility and their fast response due to spiking behaviour. The response time decreases significantly in hardware implementations of SNNs because the neurons operate in parallel. Compared with traditional computational neural networks, SNNs use fewer neurons, which also reduces their cost. Another critical characteristic of SNNs is their ability to learn by event association, which is determined mainly by postsynaptic mechanisms such as long-term potentiation (LTP). However, in some conditions, presynaptic plasticity determined by post-tetanic potentiation (PTP) occurs due to the fast activation of presynaptic neurons. This violates the Hebbian learning rules, which are specific to postsynaptic plasticity. Hebbian learning improves the ability of SNNs to discriminate the neural paths trained by the temporal association of events, which is the key element of learning in the brain. This paper quantifies the efficiency of Hebbian learning as the ratio between the LTP and PTP effects on the synaptic weights. On the basis of this new idea, this work evaluates for the first time the influence of the number of neurons on the PTP/LTP ratio and, consequently, on the Hebbian learning efficiency. The evaluation was performed by simulating a neuron model that was successfully tested in control applications. The results show that the firing rate of the postsynaptic neuron (post) depends on the number of presynaptic neurons (pres), which increases the effect of LTP on the synaptic potentiation. When post activates at a requested rate, the learning efficiency varies inversely with the number of pres, reaching its maximum when fewer than two pres are used. In addition, Hebbian learning is more efficient at lower presynaptic firing rates that are divisors of the target frequency of post. This study concludes that, when the electronic neurons model presynaptic plasticity in addition to LTP, the efficiency of Hebbian learning is higher when fewer neurons are used. This result strengthens the observations of our previous research, in which an SNN with a reduced number of neurons could successfully learn to control the motion of robotic fingers.
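To make the LTP/PTP ratio concrete, the following minimal Python sketch illustrates the idea under assumptions of our own: it is not the paper's electronic neuron model, and every name and parameter in it (the simulate function, the leaky integrate-and-fire membrane, the plasticity amplitudes a_ltp and a_ptp, the time constants) is assumed for illustration only. It drives one generic postsynaptic neuron with several regular presynaptic spike trains, accumulates a pair-based Hebbian (LTP-like) weight change and a purely presynaptic, rate-dependent (PTP-like) weight change, and reports their ratio as an illustrative learning-efficiency figure; the numbers it prints are not the paper's results.

```python
import numpy as np

def simulate(n_pre, pre_rate_hz, sim_time_s=2.0, dt=1e-3, seed=0):
    """Return (postsynaptic firing rate, LTP/PTP weight-change ratio)."""
    rng = np.random.default_rng(seed)
    steps = int(sim_time_s / dt)
    period = 1.0 / pre_rate_hz
    phases = rng.uniform(0.0, period, size=n_pre)   # spread presynaptic spike phases
    w = np.full(n_pre, 0.5)                         # synaptic weights (arbitrary units)
    v, v_th, tau_m = 0.0, 1.0, 20e-3                # assumed LIF membrane state/parameters
    last_pre = np.full(n_pre, -np.inf)              # last spike time per presynaptic neuron
    a_ltp, a_ptp, tau_ltp = 0.01, 0.002, 20e-3      # assumed plasticity constants
    ltp_total = ptp_total = 0.0
    post_spikes = 0
    for k in range(steps):
        t = k * dt
        fired = ((t - phases) % period) < dt        # presynaptic spikes in this step
        last_pre[fired] = t
        # PTP-like term: potentiation on every presynaptic spike,
        # independent of postsynaptic activity (non-Hebbian component).
        dw_ptp = a_ptp * fired
        # Leaky integration of the weighted presynaptic input.
        v += -(v / tau_m) * dt + np.sum(w * fired)
        if v >= v_th:                               # postsynaptic spike
            v = 0.0
            post_spikes += 1
            # LTP-like term: Hebbian potentiation for presynaptic spikes
            # that preceded this postsynaptic spike within about tau_ltp.
            dw_ltp = a_ltp * np.exp(-(t - last_pre) / tau_ltp)
        else:
            dw_ltp = np.zeros(n_pre)
        w = np.clip(w + dw_ltp + dw_ptp, 0.0, 1.0)  # keep weights bounded
        ltp_total += dw_ltp.sum()
        ptp_total += dw_ptp.sum()
    efficiency = ltp_total / ptp_total if ptp_total > 0 else float("inf")
    return post_spikes / sim_time_s, efficiency

if __name__ == "__main__":
    for n_pre in (1, 2, 4, 8):
        rate, eff = simulate(n_pre, pre_rate_hz=25.0)
        print(f"n_pre={n_pre}: post rate ~{rate:.1f} Hz, LTP/PTP ratio ~{eff:.2f}")
```

The design choice behind the sketch mirrors the abstract's argument: the PTP-like term depends only on presynaptic activity, so adding presynaptic neurons (or raising their rate) enlarges the non-Hebbian share of the total weight change, while the LTP-like term grows only when presynaptic and postsynaptic spikes are temporally paired.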