

Continual Learning Through Synaptic Intelligence.

Authors

Zenke Friedemann, Poole Ben, Ganguli Surya

Affiliation

Stanford University.

Publication

Proc Mach Learn Res. 2017;70:3987-3995.

PMID: 31909397
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6944509/
Abstract

While deep learning has led to remarkable advances across diverse applications, it struggles in domains where the data distribution changes over the course of learning. In stark contrast, biological neural networks continually adapt to changing domains, possibly by leveraging complex molecular machinery to solve many tasks simultaneously. In this study, we introduce intelligent synapses that bring some of this biological complexity into artificial neural networks. Each synapse accumulates task-relevant information over time, and exploits this information to rapidly store new memories without forgetting old ones. We evaluate our approach on continual learning of classification tasks, and show that it dramatically reduces forgetting while maintaining computational efficiency.
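The mechanism the abstract describes — each synapse accumulating task-relevant information during training, which is then used to protect important parameters when new tasks arrive — can be illustrated with a minimal NumPy sketch. The class and variable names below are illustrative, the toy quadratic task is invented for demonstration, and the hyperparameters are arbitrary; this is a sketch of the general idea, not the authors' reference implementation.

```python
import numpy as np

def quadratic_loss_grad(theta, target):
    # Toy task: L(theta) = 0.5 * ||theta - target||^2, so grad = theta - target
    return theta - target

class SynapticIntelligence:
    """Per-parameter importance bookkeeping in the spirit of the abstract
    (names and hyperparameters are illustrative assumptions)."""

    def __init__(self, theta, c=1.0, xi=1e-3):
        self.theta = theta.astype(float).copy()
        self.c = c                            # strength of the quadratic penalty
        self.xi = xi                          # damping to avoid division by zero
        self.omega = np.zeros_like(self.theta)  # consolidated importance per parameter
        self.w = np.zeros_like(self.theta)      # running path integral for current task
        self.theta_ref = self.theta.copy()      # parameters at the end of the last task

    def step(self, grad_task, lr=0.1):
        # Total gradient = task gradient + penalty anchoring important parameters
        grad = grad_task + 2.0 * self.c * self.omega * (self.theta - self.theta_ref)
        delta = -lr * grad
        # Accumulate each parameter's contribution to the drop in task loss:
        # w_k += -g_k * delta_theta_k (path integral of the task gradient)
        self.w += -grad_task * delta
        self.theta += delta

    def consolidate(self):
        # End of task: convert the path integral into importance and reset
        delta_total = self.theta - self.theta_ref
        self.omega += self.w / (delta_total ** 2 + self.xi)
        self.w[:] = 0.0
        self.theta_ref = self.theta.copy()
```

As a usage sketch: train on one toy target, call `consolidate()`, then train on a conflicting target. Parameters that mattered for the first task accumulate large importance, so the quadratic penalty holds them near their old values instead of letting the second task overwrite them — the "storing new memories without forgetting old ones" behavior the abstract claims, in one-dimensional miniature.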


Similar Articles

1. Continual Learning Through Synaptic Intelligence. Proc Mach Learn Res. 2017;70:3987-3995.
2. Beneficial Perturbation Network for Designing General Adaptive Artificial Intelligence Systems. IEEE Trans Neural Netw Learn Syst. 2022 Aug;33(8):3778-3791. doi: 10.1109/TNNLS.2021.3054423. Epub 2022 Aug 3.
3. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting. IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2010-2022. doi: 10.1109/TNNLS.2021.3099700. Epub 2022 May 2.
4. Continual learning with attentive recurrent neural networks for temporal data classification. Neural Netw. 2023 Jan;158:171-187. doi: 10.1016/j.neunet.2022.10.031. Epub 2022 Nov 11.
5. Progressive learning: A deep learning framework for continual learning. Neural Netw. 2020 Aug;128:345-357. doi: 10.1016/j.neunet.2020.05.011. Epub 2020 May 18.
6. LwF-ECG: Learning-without-forgetting approach for electrocardiogram heartbeat classification based on memory with task selector. Comput Biol Med. 2021 Oct;137:104807. doi: 10.1016/j.compbiomed.2021.104807. Epub 2021 Aug 27.
7. Schematic memory persistence and transience for efficient and robust continual learning. Neural Netw. 2021 Dec;144:49-60. doi: 10.1016/j.neunet.2021.08.011. Epub 2021 Aug 13.
8. Convolutional Neural Network With Developmental Memory for Continual Learning. IEEE Trans Neural Netw Learn Syst. 2021 Jun;32(6):2691-2705. doi: 10.1109/TNNLS.2020.3007548. Epub 2021 Jun 2.
9. Overcoming catastrophic forgetting in neural networks. Proc Natl Acad Sci U S A. 2017 Mar 28;114(13):3521-3526. doi: 10.1073/pnas.1611835114. Epub 2017 Mar 14.
10. Improving transparency and representational generalizability through parallel continual learning. Neural Netw. 2023 Apr;161:449-465. doi: 10.1016/j.neunet.2023.02.007. Epub 2023 Feb 10.

Cited By

1. Trajectory Tracking Controller for Quadrotor by Continual Reinforcement Learning in Wind-Disturbed Environment. Sensors (Basel). 2025 Aug 8;25(16):4895. doi: 10.3390/s25164895.
2. Dual-Stage Clean-Sample Selection for Incremental Noisy Label Learning. Bioengineering (Basel). 2025 Jul 8;12(7):743. doi: 10.3390/bioengineering12070743.
3. Interleaved Replay of Novel and Familiar Memory Traces During Slow-Wave Sleep Prevents Catastrophic Forgetting. bioRxiv. 2025 Jun 29:2025.06.25.661579. doi: 10.1101/2025.06.25.661579.
4. Domain-incremental white blood cell classification with privacy-aware continual learning. Sci Rep. 2025 Jul 15;15(1):25468. doi: 10.1038/s41598-025-08024-z.
5. Assemblies, synapse clustering, and network topology interact with plasticity to explain structure-function relationships of the cortical connectome. Elife. 2025 Jul 3;13:RP101850. doi: 10.7554/eLife.101850.
6. Continual learning across population cohorts with distribution shift: insights from multi-cohort metabolic syndrome identification. J Am Med Inform Assoc. 2025 Jun 11. doi: 10.1093/jamia/ocaf070.
7. Domain-Adaptive Continual Meta-Learning for Modeling Dynamical Systems: An Application in Environmental Ecosystems. Proc SIAM Int Conf Data Min. 2025;2025:297-306. doi: 10.1137/1.9781611978520.29.
8. Multi-scene image fusion via memory aware synapses. Sci Rep. 2025 Apr 24;15(1):14280. doi: 10.1038/s41598-025-88261-4.
9. Input-driven dynamics for robust memory retrieval in Hopfield networks. Sci Adv. 2025 Apr 25;11(17):eadu6991. doi: 10.1126/sciadv.adu6991. Epub 2025 Apr 23.
10. Exploring multi-granularity balance strategy for class incremental learning via three-way granular computing. Brain Inform. 2025 Mar 17;12(1):7. doi: 10.1186/s40708-025-00255-0.

References

1. Overcoming catastrophic forgetting in neural networks. Proc Natl Acad Sci U S A. 2017 Mar 28;114(13):3521-3526. doi: 10.1073/pnas.1611835114. Epub 2017 Mar 14.
2. Computational principles of synaptic memory consolidation. Nat Neurosci. 2016 Dec;19(12):1697-1706. doi: 10.1038/nn.4401. Epub 2016 Oct 3.
3. Deep learning. Nature. 2015 May 28;521(7553):436-44. doi: 10.1038/nature14539.
4. Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks. Nat Commun. 2015 Apr 21;6:6922. doi: 10.1038/ncomms7922.
5. Synaptic consolidation: from synapses to behavioral modeling. J Neurosci. 2015 Jan 21;35(3):1319-34. doi: 10.1523/JNEUROSCI.3989-14.2015.
6. Making memories last: the synaptic tagging and capture hypothesis. Nat Rev Neurosci. 2011 Jan;12(1):17-30. doi: 10.1038/nrn2963.
7. Neural networks for continuous online learning and control. IEEE Trans Neural Netw. 2006 Nov;17(6):1511-31. doi: 10.1109/TNN.2006.881710.
8. Cascade models of synaptically stored memories. Neuron. 2005 Feb 17;45(4):599-611. doi: 10.1016/j.neuron.2005.02.001.
9. The perceptron: a probabilistic model for information storage and organization in the brain. Psychol Rev. 1958 Nov;65(6):386-408. doi: 10.1037/h0042519.
10. Reversal and stabilization of synaptic modifications in a developing visual system. Science. 2003 Jun 20;300(5627):1953-7. doi: 10.1126/science.1082212.