Continual Learning Through Synaptic Intelligence.

Authors

Zenke Friedemann, Poole Ben, Ganguli Surya

Affiliation

Stanford University.

Publication

Proc Mach Learn Res. 2017;70:3987-3995.

Abstract

While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task-relevant information over time, and exploits this information to rapidly store new memories without forgetting old ones. We evaluate our approach on continual learning of classification tasks, and show that it dramatically reduces forgetting while maintaining computational efficiency.

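The per-synapse accumulation the abstract describes can be sketched as a running path integral: each parameter tracks how much its updates contributed to reducing the task loss, and that importance then anchors the parameter through a quadratic penalty when later tasks are trained. The sketch below is illustrative only; the variable names, the toy quadratic tasks, and all hyperparameter values are assumptions, not taken from the paper's implementation.

```python
import numpy as np

# Hedged sketch of per-synapse importance accumulation for continual learning.
# The toy setup and all constants below are illustrative assumptions.

xi = 0.1   # damping term, avoids division by zero for parameters that never move
c = 0.5    # strength of the quadratic consolidation penalty
lr = 0.05

theta = np.zeros(2)                   # two "synapses"
omega_total = np.zeros_like(theta)    # accumulated importance per parameter
theta_ref = theta.copy()              # reference weights from previous tasks

def train_task(grad_fn, steps=200):
    """Train on one task while accumulating each parameter's contribution."""
    global theta, omega_total, theta_ref
    w = np.zeros_like(theta)          # running per-parameter credit
    theta_start = theta.copy()
    for _ in range(steps):
        g_task = grad_fn(theta)
        # gradient of the quadratic surrogate that protects old memories
        g_surr = 2.0 * c * omega_total * (theta - theta_ref)
        delta = -lr * (g_task + g_surr)
        w += -g_task * delta          # loss decrease attributed to this synapse
        theta += delta
    # consolidate: normalize credit by squared displacement on this task
    omega_total += w / ((theta - theta_start) ** 2 + xi)
    theta_ref = theta.copy()

# Toy tasks: each pulls one coordinate toward 1.0 and ignores the other.
train_task(lambda t: 2.0 * (t - np.array([1.0, 0.0])) * np.array([1.0, 0.0]))
train_task(lambda t: 2.0 * (t - np.array([0.0, 1.0])) * np.array([0.0, 1.0]))

print(theta)  # stays near [1, 1]: task 1's solution survives training on task 2
```

After the second task, the first coordinate remains near its task-1 solution because its accumulated importance penalizes moving it, while the second coordinate, unimportant so far, is free to adapt.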

Similar articles

- Beneficial Perturbation Network for Designing General Adaptive Artificial Intelligence Systems. IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3778-3791. doi: 10.1109/TNNLS.2021.3054423. Epub 2022 Aug 3.
- Convolutional Neural Network With Developmental Memory for Continual Learning. IEEE Trans Neural Netw Learn Syst. 2021 Jun;32(6):2691-2705. doi: 10.1109/TNNLS.2020.3007548. Epub 2021 Jun 2.
- Overcoming catastrophic forgetting in neural networks. Proc Natl Acad Sci U S A. 2017 Mar 28;114(13):3521-3526. doi: 10.1073/pnas.1611835114. Epub 2017 Mar 14.

Cited by

References

- Overcoming catastrophic forgetting in neural networks. Proc Natl Acad Sci U S A. 2017 Mar 28;114(13):3521-3526. doi: 10.1073/pnas.1611835114. Epub 2017 Mar 14.
- Computational principles of synaptic memory consolidation. Nat Neurosci. 2016 Dec;19(12):1697-1706. doi: 10.1038/nn.4401. Epub 2016 Oct 3.
- Deep learning. Nature. 2015 May 28;521(7553):436-44. doi: 10.1038/nature14539.
- Synaptic consolidation: from synapses to behavioral modeling. J Neurosci. 2015 Jan 21;35(3):1319-34. doi: 10.1523/JNEUROSCI.3989-14.2015.
- Neural networks for continuous online learning and control. IEEE Trans Neural Netw. 2006 Nov;17(6):1511-31. doi: 10.1109/TNN.2006.881710.
- Cascade models of synaptically stored memories. Neuron. 2005 Feb 17;45(4):599-611. doi: 10.1016/j.neuron.2005.02.001.
