Learning without Forgetting.

Author Information

Li Zhizhong, Hoiem Derek

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2018 Dec;40(12):2935-2947. doi: 10.1109/TPAMI.2017.2773081. Epub 2017 Nov 14.

Abstract

When building a unified vision system or gradually adding new capabilities to a system, the usual assumption is that training data for all tasks is always available. However, as the number of tasks grows, storing and retraining on such data becomes infeasible. A new problem arises where we add new capabilities to a Convolutional Neural Network (CNN), but the training data for its existing capabilities are unavailable. We propose our Learning without Forgetting method, which uses only new task data to train the network while preserving the original capabilities. Our method performs favorably compared to commonly used feature extraction and fine-tuning adaptation techniques and performs similarly to multitask learning that uses original task data we assume unavailable. A more surprising observation is that Learning without Forgetting may be able to replace fine-tuning with similar old and new task datasets for improved new task performance.

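The abstract describes the training recipe at a high level: record the original network's responses on the new-task images, then optimize the new-task loss jointly with a distillation term that keeps the updated network's old-task outputs close to those recorded responses. Below is a minimal PyTorch-style sketch of one such update; the function names (`lwf_step`, `distillation_loss`), the `lambda_old` weighting, and the use of standard temperature-scaled distillation in place of the paper's exact renormalized cross-entropy are illustrative assumptions, not details taken from the paper.

```python
import torch.nn.functional as F


def distillation_loss(current_logits, recorded_logits, T=2.0):
    # Temperature-scaled cross-entropy between the original network's
    # recorded old-task outputs and the current network's old-task outputs.
    target = F.softmax(recorded_logits / T, dim=1)
    log_pred = F.log_softmax(current_logits / T, dim=1)
    return -(target * log_pred).sum(dim=1).mean()


def lwf_step(shared, old_head, new_head, recorded_old_logits,
             images, new_labels, optimizer, lambda_old=1.0):
    """One update on a batch of new-task images only.

    `recorded_old_logits` are the original network's old-task outputs on
    this batch, computed once before training; no old-task data is used.
    `shared`, `old_head`, and `new_head` are the shared feature extractor
    and the per-task classifier heads, all of which are being trained.
    """
    features = shared(images)
    loss_new = F.cross_entropy(new_head(features), new_labels)   # new task
    loss_old = distillation_loss(old_head(features),
                                 recorded_old_logits)            # preserve old task
    loss = loss_new + lambda_old * loss_old

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, `lambda_old` trades off new-task accuracy against preservation of the original responses; setting it to zero would reduce the step to ordinary fine-tuning on the new task.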