Kenneth A. Norman, Ehren Newman, Greg Detre, Sean Polyn
Department of Psychology, Princeton University, Princeton, NJ 08544, USA.
Neural Comput. 2006 Jul;18(7):1577-610. doi: 10.1162/neco.2006.18.7.1577.
We present a new learning algorithm that leverages oscillations in the strength of neural inhibition to train neural networks. Raising inhibition can be used to identify weak parts of target memories, which are then strengthened. Conversely, lowering inhibition can be used to identify competitors, which are then weakened. To update weights, we apply the Contrastive Hebbian Learning equation to successive time steps of the network. The sign of the weight change equation varies as a function of the phase of the inhibitory oscillation. We show that the learning algorithm can memorize large numbers of correlated input patterns without collapsing and that it shows good generalization to test patterns that do not exactly match studied patterns.
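The training loop described in the abstract can be sketched in a toy form: inhibition oscillates sinusoidally around a baseline, the network settles at each inhibition level, and a Contrastive Hebbian update is applied between successive time steps, with its sign tied to the oscillation phase. Everything below is illustrative, not the paper's implementation: the network size, the logistic settling rule, the scalar-inhibition-as-threshold approximation, and the sign convention (positive while inhibition returns toward baseline, negative while it moves away) are assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lr, steps = 16, 0.02, 40

# Symmetric recurrent weights with zero self-connections
# (illustrative random init; not taken from the paper).
W = 0.1 * rng.standard_normal((n, n))
W = (W + W.T) / 2
np.fill_diagonal(W, 0.0)

# Hypothetical target memory: first 6 units active.
target = np.zeros(n)
target[:6] = 1.0

def settle(W, inhibition, ext, n_iter=30):
    """Relax logistic units under a scalar inhibition level.
    Raising inhibition lifts the effective threshold, so weakly
    supported target units drop out; lowering it lets competitors in."""
    a = ext.copy()
    for _ in range(n_iter):
        a = 1.0 / (1.0 + np.exp(-(W @ a + 2.0 * ext - inhibition)))
    return a

# One training trial: inhibition oscillates sinusoidally around a baseline.
baseline = 1.0
prev = settle(W, baseline, target)
for t in range(1, steps + 1):
    theta = 2 * np.pi * t / steps
    dev = np.sin(theta)                 # deviation from baseline inhibition
    a = settle(W, baseline + 0.5 * dev, target)
    # CHL applied to successive time steps; the sign of the update varies
    # with the phase of the inhibitory oscillation. Assumed convention:
    # positive while inhibition returns toward baseline (the later, more
    # normal state is the "plus" phase), negative while it moves away.
    sign = -np.sign(dev * np.cos(theta))
    dW = lr * sign * (np.outer(a, a) - np.outer(prev, prev))
    dW = (dW + dW.T) / 2                # keep the weight matrix symmetric
    np.fill_diagonal(dW, 0.0)
    W += dW
    prev = a

print(np.allclose(W, W.T))  # → True: symmetry is preserved by the updates
```

On the high-inhibition half of the cycle, the settled state loses the weak parts of the target, so the update pushes weights toward the fuller pattern; on the low-inhibition half, competitors intrude, and the sign flip turns the same equation into a penalty on them.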