

Deep Learning With Asymmetric Connections and Hebbian Updates.

Author Information

Yali Amit

Affiliation

Department of Statistics, University of Chicago, Chicago, IL, United States.

Publication Information

Front Comput Neurosci. 2019 Apr 4;13:18. doi: 10.3389/fncom.2019.00018. eCollection 2019.

Abstract

We show that deep networks can be trained using Hebbian updates, yielding performance similar to ordinary back-propagation on challenging image datasets. To overcome the unrealistic symmetry in connections between layers, implicit in back-propagation, the feedback weights are kept separate from the feedforward weights. The feedback weights are also updated with a local rule, the same one used for the feedforward weights: a weight is updated solely based on the product of the activities of the units it connects. With fixed feedback weights, as proposed in Lillicrap et al. (2016), performance degrades quickly as the depth of the network increases. If the feedforward and feedback weights are initialized with the same values, as proposed in Zipser and Rumelhart (1990), they remain identical throughout training, thus precisely implementing back-propagation. We show that even when the weights are initialized differently and at random, so that the algorithm no longer performs back-propagation, performance is comparable on challenging datasets. We also propose a cost function whose derivative can be represented as a local Hebbian update on the last layer. Convolutional layers are updated with weights tied across space, which is not biologically plausible. We show that similar performance is achieved with untied layers, also known as locally connected layers, which preserve the connectivity implied by the convolutional layers but whose weights are untied and updated separately. In the linear case, we show theoretically that updating the feedback weights accelerates the convergence of the error to zero.
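The mechanism described in the abstract can be illustrated with a short sketch: feedback matrices B are kept separate from the feedforward matrices W, initialized at random rather than as transposes of W, and both sets of weights are changed by the same local product of the signals at the units each weight connects. This is a minimal illustration under stated assumptions, not the paper's code; the layer sizes, ReLU nonlinearity, learning rate, and squared-error top-layer signal are all choices made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes and learning rate (assumptions, not from the paper).
sizes = [784, 256, 128, 10]
lr = 0.01

# Feedforward weights W[l]: layer l -> layer l+1.
W = [rng.normal(0.0, 0.05, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
# Feedback weights B[l]: layer l+1 -> layer l, initialized at random,
# NOT as W[l].T, so the algorithm is not exact back-propagation.
B = [rng.normal(0.0, 0.05, (n, m)) for n, m in zip(sizes[:-1], sizes[1:])]

def relu(x):
    return np.maximum(x, 0.0)

def train_step(x, target):
    # Forward pass, storing each layer's activity (linear output layer).
    a = [x]
    for l, Wl in enumerate(W):
        pre = Wl @ a[-1]
        a.append(pre if l == len(W) - 1 else relu(pre))

    # Top-layer error signal; a squared-error gradient stands in here for the
    # paper's cost function, whose derivative is likewise local at the last layer.
    delta = a[-1] - target

    for l in reversed(range(len(W))):
        # Local product rule: each weight changes by the product of the
        # signals at the two units it connects, for W and B alike.
        dW = np.outer(delta, a[l])
        W[l] -= lr * dW
        B[l] -= lr * dW.T
        if l > 0:
            # Errors travel back through the learned feedback weights B,
            # not through W.T as in exact back-propagation.
            delta = (B[l] @ delta) * (a[l] > 0.0)
    return 0.5 * float(np.sum((a[-1] - target) ** 2))

# Example call on random data with a one-hot target, purely illustrative.
x = rng.normal(0.0, 1.0, sizes[0])
t = np.eye(sizes[-1])[3]
loss = train_step(x, t)
```

If B were instead initialized as the transpose of W, the identical updates would keep the two aligned and the loop above would reduce to exact back-propagation, matching the Zipser and Rumelhart (1990) regime the abstract mentions.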


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/38b6/6458299/43dc73dcdb56/fncom-13-00018-g0001.jpg
