

Using noise to compute error surfaces in connectionist networks: a novel means of reducing catastrophic forgetting.

Authors

French Robert M, Chater Nick

Affiliations

Quantitative Psychology and Cognitive Science, Psychology Department, University of Liège, 4000 Liège, Belgium.

Publication Information

Neural Comput. 2002 Jul;14(7):1755-69. doi: 10.1162/08997660260028700.

DOI: 10.1162/08997660260028700
PMID: 12079555
Abstract

In error-driven distributed feedforward networks, new information typically interferes, sometimes severely, with previously learned information. We show how noise can be used to approximate the error surface of previously learned information. By combining this approximated error surface with the error surface associated with the new information to be learned, the network's retention of previously learned items can be improved and catastrophic interference significantly reduced. Further, we show that the noise-generated error surface is produced using only first-derivative information and without recourse to any explicit error information.
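To make the idea concrete, below is a minimal, illustrative sketch in numpy of a noise-based rehearsal scheme in this spirit: random "noise" inputs are passed through the already-trained network, the network's own responses are kept as pseudo-targets, and gradient steps on these pseudo-items are interleaved with steps on the new items, approximately preserving the old error surface. This is not the paper's exact algorithm; the network size, learning rate, number of noise items, and interleaving schedule are all assumptions chosen for brevity.

```python
# Illustrative sketch only (not the authors' exact method): a tiny feedforward
# network that samples noise inputs, records its own outputs on them as
# pseudo-targets, and interleaves training on those pseudo-items with training
# on new items to reduce forgetting of previously learned associations.
import numpy as np

rng = np.random.default_rng(0)

def init(n_in=8, n_hid=16, n_out=8):
    # Assumed layer sizes; any small feedforward net would do for the sketch.
    return {"W1": rng.normal(0, 0.5, (n_in, n_hid)),
            "W2": rng.normal(0, 0.5, (n_hid, n_out))}

def forward(p, x):
    h = np.tanh(x @ p["W1"])
    return h, 1.0 / (1.0 + np.exp(-(h @ p["W2"])))  # sigmoid outputs

def sgd_step(p, x, t, lr=0.1):
    # One backprop step on squared error (first-derivative information only).
    h, y = forward(p, x)
    d_out = (y - t) * y * (1 - y)
    d_hid = (d_out @ p["W2"].T) * (1 - h ** 2)
    p["W2"] -= lr * h.T @ d_out
    p["W1"] -= lr * x.T @ d_hid

def error(p, X, T):
    return float(np.mean((forward(p, X)[1] - T) ** 2))

# Task A: random binary input-output associations, learned first.
XA = rng.integers(0, 2, (10, 8)).astype(float)
TA = rng.integers(0, 2, (10, 8)).astype(float)
p = init()
for _ in range(500):
    sgd_step(p, XA, TA)

# Approximate the old error surface with noise: random inputs whose targets
# are simply the trained network's current responses (pseudo-items).
X_noise = rng.random((64, 8))
_, T_noise = forward(p, X_noise)

# Task B: new associations that would normally overwrite task A.
XB = rng.integers(0, 2, (5, 8)).astype(float)
TB = rng.integers(0, 2, (5, 8)).astype(float)

# Learn task B while interleaving steps on the noise pseudo-items.
for _ in range(500):
    sgd_step(p, XB, TB)
    sgd_step(p, X_noise, T_noise)

print("task A error after learning B:", error(p, XA, TA))
print("task B error:", error(p, XB, TB))
```

Removing the second sgd_step call in the final loop reproduces plain sequential training for comparison; in this toy setting the noise pseudo-items typically keep the task A error substantially lower after task B is learned.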


Similar Articles

1. Using noise to compute error surfaces in connectionist networks: a novel means of reducing catastrophic forgetting.
   Neural Comput. 2002 Jul;14(7):1755-69. doi: 10.1162/08997660260028700.
2. Are multi-layer backpropagation networks catastrophically amnesic?
   Scand J Psychol. 2004 Nov;45(5):357-61. doi: 10.1111/j.1467-9450.2004.00417.x.
3. Ensemble learning in fixed expansion layer networks for mitigating catastrophic forgetting.
   IEEE Trans Neural Netw Learn Syst. 2013 Oct;24(10):1623-34. doi: 10.1109/TNNLS.2013.2264952.
4. Orthogonality is not a panacea: backpropagation and "catastrophic interference".
   Scand J Psychol. 2006 Oct;47(5):339-44. doi: 10.1111/j.1467-9450.2006.00528.x.
5. The road to chaos by time-asymmetric Hebbian learning in recurrent neural networks.
   Neural Comput. 2007 Jan;19(1):80-110. doi: 10.1162/neco.2007.19.1.80.
6. Catastrophic forgetting in simple networks: an analysis of the pseudorehearsal solution.
   Network. 1999 Aug;10(3):227-36.
7. Competition between synaptic depression and facilitation in attractor neural networks.
   Neural Comput. 2007 Oct;19(10):2739-55. doi: 10.1162/neco.2007.19.10.2739.
8. The loading problem for recursive neural networks.
   Neural Netw. 2005 Oct;18(8):1064-79. doi: 10.1016/j.neunet.2005.07.006. Epub 2005 Sep 29.
9. Catastrophic forgetting in connectionist networks.
   Trends Cogn Sci. 1999 Apr;3(4):128-135. doi: 10.1016/s1364-6613(99)01294-2.
10. Methods for reducing interference in the Complementary Learning Systems model: oscillating inhibition and autonomous memory rehearsal.
    Neural Netw. 2005 Nov;18(9):1212-28. doi: 10.1016/j.neunet.2005.08.010. Epub 2005 Nov 2.

Cited By

1. Overcoming catastrophic forgetting in neural networks.
   Proc Natl Acad Sci U S A. 2017 Mar 28;114(13):3521-3526. doi: 10.1073/pnas.1611835114. Epub 2017 Mar 14.