Are multi-layer backpropagation networks catastrophically amnesic?

Author Information

Yamaguchi Makoto

Affiliation

Waseda University, Tokyo 169-8050, Japan.

Publication Information

Scand J Psychol. 2004 Nov;45(5):357-61. doi: 10.1111/j.1467-9450.2004.00417.x.

Abstract

Connectionist models with a backpropagation learning rule are known to have a serious problem: they exhibit catastrophic interference (or forgetting) under sequential training. After such a model has learned one set of patterns, training it on a second set can rapidly and dramatically degrade its performance on the first. The present study reconsiders this issue with four simulations. The model learned arithmetic facts sequentially, but interference was only modest with random (hence approximately orthogonal) inputs. Essentially the same result was obtained when the inputs were made less orthogonal by adding irrelevant elements. Reducing the number of hidden units had no major effect. This study suggests that the interference problem has been somewhat overstated.
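To make the experimental setup concrete, below is a minimal NumPy sketch of the kind of test the abstract describes: a one-hidden-layer backpropagation network is trained on one set of random (hence approximately orthogonal) patterns, then trained sequentially on a second set, and its retention of the first set is measured. This is not the paper's actual simulation; the layer sizes, learning rate, epoch count, and pattern counts are all illustrative assumptions.

```python
# Hypothetical demonstration of catastrophic interference under sequential
# training; parameters are illustrative, not those of Yamaguchi (2004).
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hid, n_out = 50, 10, 5           # assumed layer sizes
W1 = rng.normal(0, 0.3, (n_in, n_hid))   # input -> hidden weights
W2 = rng.normal(0, 0.3, (n_hid, n_out))  # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1)
    return h, sigmoid(h @ W2)

def train(X, T, epochs=2000, lr=0.5):
    """Plain batch backpropagation on squared error."""
    global W1, W2
    for _ in range(epochs):
        h, y = forward(X)
        d_out = (y - T) * y * (1 - y)          # output-layer delta
        d_hid = (d_out @ W2.T) * h * (1 - h)   # hidden-layer delta
        W2 -= lr * h.T @ d_out
        W1 -= lr * X.T @ d_hid

def mse(X, T):
    return float(np.mean((forward(X)[1] - T) ** 2))

# Two sets of random input patterns; high-dimensional random vectors are
# roughly orthogonal, the condition under which the paper reports only
# modest interference.
XA, XB = rng.random((8, n_in)), rng.random((8, n_in))
TA = rng.integers(0, 2, (8, n_out)).astype(float)
TB = rng.integers(0, 2, (8, n_out)).astype(float)

train(XA, TA)
err_before = mse(XA, TA)   # error on set A right after learning it
train(XB, TB)              # sequential training on set B only
err_after = mse(XA, TA)    # how much of set A survived?
print(f"MSE on A before training B: {err_before:.4f}, after: {err_after:.4f}")
```

The gap between the two printed errors is the interference measure: the larger the rise in error on set A after training on set B alone, the more catastrophic the forgetting.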
