CEMS, School of Computing, University of Kent, CT2 7NF, Canterbury, UK.
Neural Netw. 2021 Mar;135:192-200. doi: 10.1016/j.neunet.2020.12.012. Epub 2021 Jan 2.
We analyse mathematically the constraints on weights resulting from Hebbian and STDP learning rules applied to a spiking neuron with weight normalisation. In the case of pure Hebbian learning, we find that the normalised weights equal the promotion probabilities of the weights, up to correction terms that depend on the learning rate and are usually small. A similar relation can be derived for STDP algorithms, where the normalised weight values reflect the difference between the promotion and demotion probabilities of the weight. These relations are useful in practice because they make it possible to check the convergence of Hebbian and STDP algorithms. A further application is novelty detection, which we demonstrate on the MNIST dataset.
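The Hebbian relation stated in the abstract can be illustrated with a toy simulation (this is a hypothetical sketch, not the paper's derivation or experimental setup): each synapse is independently "promoted" (incremented by the learning rate) with a fixed probability, and the weight vector is renormalised after every step. The normalised weights then settle near the normalised promotion probabilities, with fluctuations that shrink with the learning rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical promotion probabilities for four synapses
p = np.array([0.1, 0.2, 0.3, 0.4])
eta = 0.001                # learning rate (small, so corrections are small)
w = np.full(4, 0.25)       # normalised initial weights

for _ in range(200_000):
    promoted = rng.random(4) < p   # which weights are promoted this step
    w = w + eta * promoted         # Hebbian increment
    w = w / w.sum()                # weight normalisation

# The expected fixed point satisfies w_i = p_i / sum(p): the normalised
# weights track the promotion probabilities up to O(eta) corrections.
print(np.round(w, 2))              # approximately p / p.sum()
```

Comparing the converged weights against the empirical promotion probabilities in this way is one possible form of the convergence check the abstract mentions; the STDP case would additionally subtract an estimated demotion probability per weight.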