Department of Physiology, University of Bern, Bern, Switzerland.
Kirchhoff-Institute for Physics, Heidelberg University, Heidelberg, Germany.
Elife. 2022 Apr 25;11:e66526. doi: 10.7554/eLife.66526.
In many normative theories of synaptic plasticity, weight updates implicitly depend on the chosen parametrization of the weights. This problem relates, for example, to neuronal morphology: synapses which are functionally equivalent in terms of their impact on somatic firing can differ substantially in spine size due to their different positions along the dendritic tree. Classical theories based on Euclidean-gradient descent can easily lead to inconsistencies due to such parametrization dependence. These issues are resolved in the framework of Riemannian geometry, in which we propose that plasticity instead follows natural-gradient descent. Under this hypothesis, we derive a synaptic learning rule for spiking neurons that couples functional efficiency with the explanation of several well-documented biological phenomena such as dendritic democracy, multiplicative scaling, and heterosynaptic plasticity. We therefore suggest that in its search for functional synaptic plasticity, evolution might have come up with its own version of natural-gradient descent.
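The parametrization dependence described above can be made concrete in a minimal sketch (not the paper's spiking-neuron model; the scalar loss, the reparametrization w = theta^2, and all variable names are illustrative assumptions). A plain Euclidean-gradient step taken in theta-coordinates induces a different change in w than the same step taken directly in w-coordinates, whereas a natural-gradient step, preconditioned by the metric pulled back from w-space, induces the same change regardless of the coordinates used:

```python
# Toy illustration of parametrization (in)dependence of gradient descent.
# Scalar weight w, quadratic loss L(w) = (w*x - t)^2, and an assumed
# reparametrization w = theta**2 standing in for, e.g., the mapping from
# spine size to effective somatic weight.

def loss_grad_w(w, x=2.0, t=3.0):
    """dL/dw for L(w) = (w*x - t)^2."""
    return 2.0 * (w * x - t) * x

eta = 0.01
w0 = 0.5
theta0 = w0 ** 0.5          # same synaptic state, expressed in theta-coordinates
dw_dtheta = 2.0 * theta0    # Jacobian of the reparametrization w = theta**2

# 1) Euclidean-gradient step directly in w-space.
dw_euclid = -eta * loss_grad_w(w0)

# 2) Euclidean-gradient step in theta-space (chain rule), mapped back to
#    w-space to first order: it is rescaled by (dw/dtheta)**2, so the
#    functional update depends on the chosen coordinates.
dtheta_euclid = -eta * loss_grad_w(w0) * dw_dtheta
dw_from_theta_euclid = dw_dtheta * dtheta_euclid

# 3) Natural-gradient step in theta-space: precondition the gradient by the
#    inverse metric. The metric pulled back from w-space is G = (dw/dtheta)**2,
#    which exactly cancels the Jacobian factors.
G = dw_dtheta ** 2
dtheta_natural = -eta * loss_grad_w(w0) * dw_dtheta / G
dw_from_theta_natural = dw_dtheta * dtheta_natural

print(dw_euclid, dw_from_theta_euclid, dw_from_theta_natural)
```

Here the natural-gradient step reproduces the w-space update exactly, which is the coordinate invariance the abstract invokes; in the paper's setting the metric is derived from the neuron's output statistics rather than assumed.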