Graduate School of Frontier Sciences, The University of Tokyo, Kashiwa, Chiba 277-8561, Japan.
Neural Netw. 2011 Apr;24(3):267-72. doi: 10.1016/j.neunet.2010.12.001. Epub 2010 Dec 9.
Node perturbation learning is a stochastic gradient descent method for neural networks. It estimates the gradient by comparing an evaluation of the perturbed output with that of the unperturbed output, which we call the baseline. Node perturbation learning has primarily been investigated without taking noise on the baseline into consideration. In real biological systems, however, neural activities are intrinsically noisy, and hence the baseline is likely to be contaminated with noise. In this paper, we propose an alternative learning method that does not require such a noiseless baseline. Our method uses a "second perturbation", computed with noise different from that of the first perturbation. The network weights are updated by comparing the evaluation of the outcome under the first perturbation with that under the second perturbation. We show that the learning speed decreases only linearly with the variance of the second perturbation. Moreover, using the second perturbation can reduce the residual error below that obtained with the noiseless baseline.
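The update rule described in the abstract can be sketched as follows. This is a minimal illustrative toy example only: the linear single-layer network, the squared-error evaluation, the Gaussian perturbations, and all parameter values are assumptions for demonstration, not details taken from the paper.

```python
import numpy as np

# Sketch of node perturbation learning in which a "second perturbation"
# replaces the noiseless baseline. Toy setting (assumption): a linear
# layer W learns a fixed target mapping W_target from input-output pairs.
rng = np.random.default_rng(0)

n_in, n_out = 5, 3
W_target = rng.standard_normal((n_out, n_in))  # mapping to be learned
W = np.zeros((n_out, n_in))

sigma = 0.1   # std of both perturbations (their variances may differ in general)
eta = 0.002   # learning rate (illustrative value)

init_err = np.linalg.norm(W - W_target)

for step in range(30000):
    x = rng.standard_normal(n_in)
    y = W_target @ x                          # desired output

    xi1 = sigma * rng.standard_normal(n_out)  # first perturbation
    xi2 = sigma * rng.standard_normal(n_out)  # independent second perturbation

    # Evaluate the network output under each perturbation; the second
    # evaluation plays the role of the (noisy) baseline.
    E1 = 0.5 * np.sum((W @ x + xi1 - y) ** 2)
    E2 = 0.5 * np.sum((W @ x + xi2 - y) ** 2)

    # Gradient estimate: correlate the difference of the two evaluations
    # with the first perturbation, then descend.
    W -= eta * (E1 - E2) / sigma**2 * np.outer(xi1, x)

final_err = np.linalg.norm(W - W_target)
```

On this toy problem the rule behaves like noisy gradient descent: to leading order in sigma, the expectation of (E1 - E2) * xi1 / sigma**2 is the error gradient of the output, so W drifts toward W_target, with the extra variance contributed by the second perturbation slowing convergence rather than biasing it.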