
Replay in Deep Learning: Current Approaches and Missing Biological Elements.

Author Affiliations

Rochester Institute of Technology, Rochester, NY 14623, U.S.A.

University of California at San Diego, La Jolla, CA 92093, U.S.A.

Publication Information

Neural Comput. 2021 Oct 12;33(11):2908-2950. doi: 10.1162/neco_a_01433.

DOI: 10.1162/neco_a_01433
PMID: 34474476
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9074752/
Abstract

Replay is the reactivation of one or more neural patterns that are similar to the activation patterns experienced during past waking experiences. Replay was first observed in biological neural networks during sleep, and it is now thought to play a critical role in memory formation, retrieval, and consolidation. Replay-like mechanisms have been incorporated in deep artificial neural networks that learn over time to avoid catastrophic forgetting of previous knowledge. Replay algorithms have been successfully used in a wide range of deep learning methods within supervised, unsupervised, and reinforcement learning paradigms. In this letter, we provide the first comprehensive comparison between replay in the mammalian brain and replay in artificial neural networks. We identify multiple aspects of biological replay that are missing in deep learning systems and hypothesize how they could be used to improve artificial neural networks.

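The replay mechanisms the abstract describes are, in their simplest deep-learning form, rehearsal from a fixed-size memory of past examples mixed into each new training batch. Below is a minimal sketch of that idea using reservoir sampling, a common buffer-maintenance choice in continual learning; all class and function names here are illustrative and are not taken from the letter itself.

```python
import random


class ReplayBuffer:
    """Fixed-capacity memory of past (task, example) pairs.

    Reservoir sampling keeps every example seen so far with equal
    probability, so the buffer stays representative of old tasks.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a random slot with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))


def rehearsal_batch(new_examples, memory, replay_k):
    """Interleave fresh task data with replayed old examples, so each
    gradient step rehearses prior knowledge alongside the new task."""
    return list(new_examples) + memory.sample(replay_k)


if __name__ == "__main__":
    buf = ReplayBuffer(capacity=5)
    # Fill the buffer while "training" on task A.
    for i in range(100):
        buf.add(("task_A", i))
    # When task B arrives, each batch mixes new and replayed data.
    batch = rehearsal_batch([("task_B", 0), ("task_B", 1)], buf, replay_k=3)
    print(len(batch))
```

A real continual-learning system would feed `batch` to an optimizer step; more biologically inspired variants (e.g., generative replay, as in some of the works listed below) replace the stored examples with samples from a learned generator.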

Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/817c/9074752/4222f497e6ec/nihms-1799912-f0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/817c/9074752/3a5222d11913/nihms-1799912-f0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/817c/9074752/81ef67f6b373/nihms-1799912-f0003.jpg

Similar Articles

1. Replay in Deep Learning: Current Approaches and Missing Biological Elements.
Neural Comput. 2021 Oct 12;33(11):2908-2950. doi: 10.1162/neco_a_01433.
2. Learning offline: memory replay in biological and artificial reinforcement learning.
Trends Neurosci. 2021 Oct;44(10):808-821. doi: 10.1016/j.tins.2021.07.007. Epub 2021 Sep 1.
3. A neural network account of memory replay and knowledge consolidation.
Cereb Cortex. 2022 Dec 15;33(1):83-95. doi: 10.1093/cercor/bhac054.
4. Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks.
Nat Commun. 2022 Dec 15;13(1):7742. doi: 10.1038/s41467-022-34938-7.
5. Awake replay of remote experiences in the hippocampus.
Nat Neurosci. 2009 Jul;12(7):913-8. doi: 10.1038/nn.2344. Epub 2009 Jun 14.
6. Brain-inspired replay for continual learning with artificial neural networks.
Nat Commun. 2020 Aug 13;11(1):4069. doi: 10.1038/s41467-020-17866-2.
7. Can sleep protect memories from catastrophic forgetting?
Elife. 2020 Aug 4;9:e51005. doi: 10.7554/eLife.51005.
8. Synaptic Mechanisms of Memory Consolidation during Sleep Slow Oscillations.
J Neurosci. 2016 Apr 13;36(15):4231-47. doi: 10.1523/JNEUROSCI.3648-15.2016.
9. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning.
IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1925-1934. doi: 10.1109/TNNLS.2021.3111019. Epub 2022 May 2.
10. Awake Hippocampal-Cortical Co-reactivation Is Associated with Forgetting.
J Cogn Neurosci. 2023 Sep 1;35(9):1446-1462. doi: 10.1162/jocn_a_02021.

Cited By

1. Interleaved Replay of Novel and Familiar Memory Traces During Slow-Wave Sleep Prevents Catastrophic Forgetting.
bioRxiv. 2025 Jun 29:2025.06.25.661579. doi: 10.1101/2025.06.25.661579.
2. Plasticity in inhibitory networks improves pattern separation in early olfactory processing.
Commun Biol. 2025 Apr 9;8(1):590. doi: 10.1038/s42003-025-07879-2.
3. Elements of episodic memory: insights from artificial agents.
Philos Trans R Soc Lond B Biol Sci. 2024 Nov 4;379(1913):20230416. doi: 10.1098/rstb.2023.0416. Epub 2024 Sep 16.
4. Bridging Neuroscience and AI: Environmental Enrichment as a model for forward knowledge transfer in continual learning.
ArXiv. 2025 Jan 23:arXiv:2405.07295v3.
5. Paradoxical replay can protect contextual task representations from destructive interference when experience is unbalanced.
bioRxiv. 2024 May 9:2024.05.09.593332. doi: 10.1101/2024.05.09.593332.
6. Plasticity in inhibitory networks improves pattern separation in early olfactory processing.
bioRxiv. 2025 Feb 20:2024.01.24.576675. doi: 10.1101/2024.01.24.576675.
7. Privacy-preserving continual learning methods for medical image classification: a comparative analysis.
Front Med (Lausanne). 2023 Aug 14;10:1227515. doi: 10.3389/fmed.2023.1227515. eCollection 2023.
8. Continual Deep Learning for Time Series Modeling.
Sensors (Basel). 2023 Aug 14;23(16):7167. doi: 10.3390/s23167167.
9. Online Continual Learning in Acoustic Scene Classification: An Empirical Study.
Sensors (Basel). 2023 Aug 3;23(15):6893. doi: 10.3390/s23156893.
10. Neuro-inspired continual anthropomorphic grasping.
iScience. 2023 Apr 25;26(6):106735. doi: 10.1016/j.isci.2023.106735. eCollection 2023 Jun 16.

References

1. Heterogeneous Network Representation Learning: A Unified Framework with Survey and Benchmark.
IEEE Trans Knowl Data Eng. 2022 Oct;34(10):4854-4873. doi: 10.1109/tkde.2020.3045924. Epub 2020 Dec 21.
2. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation.
PLoS Comput Biol. 2022 Nov 18;18(11):e1010628. doi: 10.1371/journal.pcbi.1010628. eCollection 2022 Nov.
3. Learning Structures: Predictive Representations, Replay, and Generalization.
Curr Opin Behav Sci. 2020 Apr;32:155-166. doi: 10.1016/j.cobeha.2020.02.017. Epub 2020 May 5.
4. Modeling the Background for Incremental and Weakly-Supervised Semantic Segmentation.
IEEE Trans Pattern Anal Mach Intell. 2022 Dec;44(12):10099-10113. doi: 10.1109/TPAMI.2021.3133954. Epub 2022 Nov 7.
5. A Two-Stream Continual Learning System With Variational Domain-Agnostic Feature Replay.
IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4466-4478. doi: 10.1109/TNNLS.2021.3057453. Epub 2022 Aug 31.
6. A Continual Learning Survey: Defying Forgetting in Classification Tasks.
IEEE Trans Pattern Anal Mach Intell. 2022 Jul;44(7):3366-3385. doi: 10.1109/TPAMI.2021.3057446. Epub 2022 Jun 3.
7. A comprehensive study of class incremental learning algorithms for visual tasks.
Neural Netw. 2021 Mar;135:38-54. doi: 10.1016/j.neunet.2020.12.003. Epub 2020 Dec 8.
8. The Tolman-Eichenbaum Machine: Unifying Space and Relational Memory through Generalization in the Hippocampal Formation.
Cell. 2020 Nov 25;183(5):1249-1263.e23. doi: 10.1016/j.cell.2020.10.024. Epub 2020 Nov 11.
9. Bidirectional Interaction of Hippocampal Ripples and Cortical Slow Waves Leads to Coordinated Spiking Activity During NREM Sleep.
Cereb Cortex. 2021 Jan 1;31(1):324-340. doi: 10.1093/cercor/bhaa228.
10. Brain-inspired replay for continual learning with artificial neural networks.
Nat Commun. 2020 Aug 13;11(1):4069. doi: 10.1038/s41467-020-17866-2.