IBM Research - Zurich, Rüschlikon, Switzerland.
Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland.
Nat Commun. 2022 Apr 7;13(1):1885. doi: 10.1038/s41467-022-29491-2.
Plasticity circuits in the brain are known to be influenced by the distribution of synaptic weights through the mechanisms of synaptic integration and local regulation of synaptic strength. However, the complex interplay of stimulation-dependent plasticity with local learning signals is disregarded by most of the artificial neural network training algorithms devised so far. Here, we propose a novel biologically inspired optimizer for artificial and spiking neural networks that incorporates key principles of synaptic plasticity observed in cortical dendrites: GRAPES (Group Responsibility for Adjusting the Propagation of Error Signals). GRAPES implements a weight-distribution-dependent modulation of the error signal at each node of the network. We show that this biologically inspired mechanism leads to a substantial performance improvement in artificial and spiking networks with feedforward, convolutional, and recurrent architectures; it mitigates catastrophic forgetting and is well suited to dedicated hardware implementations. Overall, our work indicates that reconciling neurophysiology insights with machine intelligence is key to boosting the performance of neural networks.
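To make the core idea concrete, the following is a minimal NumPy sketch of a weight-distribution-dependent modulation of a backpropagated error signal at the nodes of one layer. The specific modulation rule used here (each node's share of the layer's total absolute incoming weight) is an illustrative assumption for this sketch, not the published GRAPES formula, which the abstract does not specify.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: 4 inputs feeding 3 hidden nodes.
W = rng.normal(size=(3, 4))

# Backpropagated error signal arriving at the 3 hidden nodes.
delta = rng.normal(size=(3,))

# Hypothetical per-node "responsibility" derived from the weight
# distribution: each node's absolute incoming-weight mass, normalized
# by the layer mean so the modulation is ~1 on average. This exact
# formula is an assumption made for illustration only.
node_strength = np.abs(W).sum(axis=1)
responsibility = node_strength / node_strength.mean()

# Modulated error signal: nodes carrying more of the layer's weight
# mass receive a proportionally larger share of the error signal.
modulated_delta = delta * responsibility
```

The sketch only illustrates the general mechanism described in the abstract: the error signal at each node is rescaled by a factor computed locally from the distribution of that layer's weights, rather than being propagated uniformly as in plain backpropagation.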