Klos Christian, Memmesheimer Raoul-Martin
University of Bonn, Neural Network Dynamics and Computation, Institute of Genetics, 53115 Bonn, Germany.
Phys Rev Lett. 2025 Jan 17;134(2):027301. doi: 10.1103/PhysRevLett.134.027301.
Gradient descent prevails in artificial neural network training, but seems ill-suited to spiking neural networks, as small parameter changes can cause sudden, disruptive appearances and disappearances of spikes. Here, we demonstrate exact gradient descent based on continuously changing spiking dynamics. These are generated by neuron models whose spikes vanish and appear at the end of a trial, where they cannot influence subsequent dynamics. This also enables gradient-based spike addition and removal. We illustrate our scheme with various tasks and setups, including recurrent and deep, initially silent networks.
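The discontinuity that motivates this work can be seen in a minimal sketch (not the authors' model): a leaky integrate-and-fire neuron with a constant input, where the function `lif_spike_count` and all parameter values are illustrative assumptions. Sweeping the input weight changes the membrane trajectory smoothly, yet the number of spikes in the trial jumps in integer steps.

```python
# Minimal sketch (illustrative, not the paper's neuron model): a leaky
# integrate-and-fire neuron shows why spike counts are discontinuous
# in the parameters.
def lif_spike_count(w, T=100, dt=1.0, tau=20.0, v_th=1.0):
    """Euler-simulate dv/dt = (-v + w)/tau with threshold v_th and
    reset to 0; return the number of spikes in a trial of length T."""
    v, n_spikes = 0.0, 0
    for _ in range(int(T / dt)):
        v += dt * (-v + w) / tau
        if v >= v_th:          # spike: count it and reset the membrane
            n_spikes += 1
            v = 0.0
    return n_spikes

# Sweep the input weight: the spike count (and any loss built on the
# spikes) changes in integer jumps, so its derivative with respect to
# w is zero almost everywhere and undefined at the jumps.
counts = [lif_spike_count(1.0 + 0.01 * i) for i in range(41)]
```

In the scheme summarized in the abstract, spikes instead appear and disappear only at the end of the trial, where they cannot affect subsequent dynamics, so the dynamics (and hence the loss) vary continuously and exact gradients exist.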