Overcoming catastrophic forgetting in neural networks.

Affiliations

DeepMind, London EC4 5TW, United Kingdom.

Publication

Proc Natl Acad Sci U S A. 2017 Mar 28;114(13):3521-3526. doi: 10.1073/pnas.1611835114. Epub 2017 Mar 14.

Abstract

The ability to learn tasks in a sequential fashion is crucial to the development of artificial intelligence. Until now neural networks have not been capable of this and it has been widely thought that catastrophic forgetting is an inevitable feature of connectionist models. We show that it is possible to overcome this limitation and train networks that can maintain expertise on tasks that they have not experienced for a long time. Our approach remembers old tasks by selectively slowing down learning on the weights important for those tasks. We demonstrate our approach is scalable and effective by solving a set of classification tasks based on a hand-written digit dataset and by learning several Atari 2600 games sequentially.
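
The abstract's central idea is that weights important for an earlier task are selectively slowed down when learning a new one; in the paper this takes the form of a quadratic penalty that anchors each weight to its post-old-task value, scaled by a per-weight importance estimate (the diagonal of the Fisher information). The sketch below is a minimal NumPy illustration of that mechanism, not the authors' implementation; the function names, importance values, and toy numbers are invented for illustration.

import numpy as np

def importance_penalty(theta, theta_old, importance, lam=1.0):
    # Quadratic penalty that grows when weights the old task relied on
    # drift away from the values they held after learning that task.
    return 0.5 * lam * np.sum(importance * (theta - theta_old) ** 2)

def penalized_gradient(grad_new_task, theta, theta_old, importance, lam=1.0):
    # Gradient of (new-task loss + penalty): important weights are pulled
    # back toward theta_old, so learning on them is effectively slowed.
    return grad_new_task + lam * importance * (theta - theta_old)

# Hypothetical toy usage: two weights, the first important for the old task.
theta_old = np.array([1.0, -0.5])
importance = np.array([10.0, 0.1])   # e.g. a diagonal Fisher estimate
theta = np.array([1.2, 0.3])
print(importance_penalty(theta, theta_old, importance, lam=2.0))

Training on the new task then minimizes the new-task loss plus this penalty, so weights that mattered little for the old task remain free to change while important ones move only slowly.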

Similar Articles

1. Overcoming catastrophic forgetting in neural networks.
Proc Natl Acad Sci U S A. 2017 Mar 28;114(13):3521-3526. doi: 10.1073/pnas.1611835114. Epub 2017 Mar 14.

4. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
Proc Natl Acad Sci U S A. 2018 Oct 30;115(44):E10467-E10475. doi: 10.1073/pnas.1803839115. Epub 2018 Oct 12.

7. Comparing continual task learning in minds and machines.
Proc Natl Acad Sci U S A. 2018 Oct 30;115(44):E10313-E10322. doi: 10.1073/pnas.1800755115. Epub 2018 Oct 15.

8. Contributions by metaplasticity to solving the Catastrophic Forgetting Problem.
Trends Neurosci. 2022 Sep;45(9):656-666. doi: 10.1016/j.tins.2022.06.002. Epub 2022 Jul 4.

9. Online continual learning with declarative memory.
Neural Netw. 2023 Jun;163:146-155. doi: 10.1016/j.neunet.2023.03.025. Epub 2023 Mar 27.

10. Incremental Concept Learning via Online Generative Memory Recall.
IEEE Trans Neural Netw Learn Syst. 2021 Jul;32(7):3206-3216. doi: 10.1109/TNNLS.2020.3010581. Epub 2021 Jul 6.
