IEEE Trans Cybern. 2022 Oct;52(10):11254-11266. doi: 10.1109/TCYB.2021.3060466. Epub 2022 Sep 19.
Most existing studies on computational modeling of neural plasticity have focused on synaptic plasticity. However, regulating the internal reservoir weights via synaptic plasticity alone often results in unstable learning dynamics. In this article, a structural synaptic plasticity learning rule is proposed that both trains the weights and adds or removes neurons within the reservoir; this rule is shown to alleviate the instability of synaptic plasticity and to increase the memory capacity of the network. Our experimental results also reveal that a few stronger connections persist for longer periods in a constantly changing network structure and are relatively resistant to decay or disruption during learning, consistent with evidence observed in biological systems. Finally, we show that an echo state network (ESN) trained with the proposed structural plasticity rule outperforms an ESN using synaptic plasticity, as well as three state-of-the-art ESNs, on four benchmark tasks.
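To make the idea of a structural update concrete, the following is a minimal sketch, in NumPy, of a reservoir whose topology changes over time: weak connections are pruned and new neurons are grown with sparse random connectivity. The pruning fraction, growth rate, and initialization scales here are illustrative assumptions, not the rule proposed in the article; the actual learning rule also trains the surviving weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_reservoir(n, density=0.2, spectral_radius=0.9):
    # Sparse random reservoir matrix, rescaled to the target spectral radius
    # (a standard ESN initialization, not specific to this article).
    W = rng.standard_normal((n, n)) * (rng.random((n, n)) < density)
    radius = max(abs(np.linalg.eigvals(W)))
    return W * (spectral_radius / radius)

def structural_step(W, prune_frac=0.05, grow=1, density=0.2):
    # Illustrative structural plasticity step (hypothetical parameters):
    # 1) remove the weakest fraction of existing connections, ...
    W = W.copy()
    nz = np.flatnonzero(W)
    if nz.size:
        k = max(1, int(prune_frac * nz.size))
        weakest = nz[np.argsort(np.abs(W.flat[nz]))[:k]]
        W.flat[weakest] = 0.0
    # ... 2) grow `grow` new neurons with sparse random in/out connections.
    n = W.shape[0]
    W_new = np.zeros((n + grow, n + grow))
    W_new[:n, :n] = W
    for i in range(n, n + grow):
        mask = rng.random(n + grow) < density
        W_new[i, mask] = rng.standard_normal(mask.sum()) * 0.1
        W_new[mask, i] = rng.standard_normal(mask.sum()) * 0.1
    return W_new

W = init_reservoir(50)
W2 = structural_step(W)  # reservoir now has 51 neurons, fewer weak links
```

Note that because only the weakest connections are candidates for removal, the strongest connections naturally survive many such steps, which is the qualitative behavior the abstract reports for biological synapses.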