Synaptic metaplasticity in binarized neural networks.

Affiliations

Université Paris-Saclay, CNRS, Centre de Nanosciences et de Nanotechnologies, Palaiseau, France.

Unité Mixte de Physique, CNRS, Thales, Université Paris-Saclay, Palaiseau, France.

Publication information

Nat Commun. 2021 May 5;12(1):2549. doi: 10.1038/s41467-021-22768-y.

DOI: 10.1038/s41467-021-22768-y
PMID: 33953183
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8100137/
Abstract

While deep neural networks have surpassed human performance in multiple situations, they are prone to catastrophic forgetting: upon training a new task, they rapidly forget previously learned ones. Neuroscience studies, based on idealized tasks, suggest that in the brain, synapses overcome this issue by adjusting their plasticity depending on their past history. However, such "metaplastic" behaviors do not transfer directly to mitigate catastrophic forgetting in deep neural networks. In this work, we interpret the hidden weights used by binarized neural networks, a low-precision version of deep neural networks, as metaplastic variables, and modify their training technique to alleviate forgetting. Building on this idea, we propose and demonstrate experimentally, in situations of multitask and stream learning, a training technique that reduces catastrophic forgetting without needing previously presented data, nor formal boundaries between datasets and with performance approaching more mainstream techniques with task boundaries. We support our approach with a theoretical analysis on a tractable task. This work bridges computational neuroscience and deep learning, and presents significant assets for future embedded and neuromorphic systems, especially when using novel nanodevices featuring physics analogous to metaplasticity.
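
The abstract's core idea lends itself to a compact illustration. The sketch below is a minimal, hypothetical Python rendering of that idea: each binary weight's hidden, real-valued counterpart is treated as a metaplastic variable, and the more it has been consolidated, the harder it becomes to overwrite. The attenuation function (1 - tanh^2), the parameter m, and all function names here are illustrative assumptions, not the paper's exact training rule.

import numpy as np

def metaplastic_update(hidden_w, grad, lr=0.01, m=1.0):
    """Metaplasticity-inspired update for the hidden (real-valued) weights
    behind a binarized layer. Updates that would push a hidden weight back
    toward zero -- and could eventually flip the binary sign learned on
    earlier tasks -- are attenuated more strongly the larger the hidden
    weight has grown. Updates that reinforce the current sign pass through
    unchanged. (Sketch; the exact rule in the paper may differ.)"""
    update = -lr * grad
    # An update "weakens" a weight when it points against the hidden weight's sign.
    weakens = np.sign(update) != np.sign(hidden_w)
    # Attenuation shrinks with |hidden_w|: well-consolidated weights resist erasure.
    attenuation = 1.0 - np.tanh(m * np.abs(hidden_w)) ** 2
    update = np.where(weakens, attenuation * update, update)
    return hidden_w + update

def binarize(hidden_w):
    """Binary weights actually used in the forward pass: the sign of the hidden weights."""
    return np.where(hidden_w >= 0, 1.0, -1.0)

# Tiny usage example on random values.
rng = np.random.default_rng(0)
w_hidden = rng.normal(size=5)
grad = rng.normal(size=5)
w_hidden = metaplastic_update(w_hidden, grad)
w_binary = binarize(w_hidden)

In this reading, gradients that reinforce a weight's current sign are applied in full, while gradients that would erode a well-consolidated sign are damped, which is how the sketch approximates resistance to catastrophic forgetting without storing past data or task boundaries.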

Figures 1-6 of the article are available via the PMC full-text link above.

Similar articles

1. Synaptic metaplasticity in binarized neural networks.
Nat Commun. 2021 May 5;12(1):2549. doi: 10.1038/s41467-021-22768-y.
2. Alleviating catastrophic forgetting using context-dependent gating and synaptic stabilization.
Proc Natl Acad Sci U S A. 2018 Oct 30;115(44):E10467-E10475. doi: 10.1073/pnas.1803839115. Epub 2018 Oct 12.
3. Overcoming Long-Term Catastrophic Forgetting Through Adversarial Neural Pruning and Synaptic Consolidation.
IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4243-4256. doi: 10.1109/TNNLS.2021.3056201. Epub 2022 Aug 31.
4. Contributions by metaplasticity to solving the Catastrophic Forgetting Problem.
Trends Neurosci. 2022 Sep;45(9):656-666. doi: 10.1016/j.tins.2022.06.002. Epub 2022 Jul 4.
5. Sleep prevents catastrophic forgetting in spiking neural networks by forming a joint synaptic weight representation.
PLoS Comput Biol. 2022 Nov 18;18(11):e1010628. doi: 10.1371/journal.pcbi.1010628. eCollection 2022 Nov.
6. Sleep-like unsupervised replay reduces catastrophic forgetting in artificial neural networks.
Nat Commun. 2022 Dec 15;13(1):7742. doi: 10.1038/s41467-022-34938-7.
7. Memory Recall: A Simple Neural Network Training Framework Against Catastrophic Forgetting.
IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):2010-2022. doi: 10.1109/TNNLS.2021.3099700. Epub 2022 May 2.
8. Continual Learning Using Bayesian Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2021 Sep;32(9):4243-4252. doi: 10.1109/TNNLS.2020.3017292. Epub 2021 Aug 31.
9. Ensemble learning in fixed expansion layer networks for mitigating catastrophic forgetting.
IEEE Trans Neural Netw Learn Syst. 2013 Oct;24(10):1623-34. doi: 10.1109/TNNLS.2013.2264952.
10. LwF-ECG: Learning-without-forgetting approach for electrocardiogram heartbeat classification based on memory with task selector.
Comput Biol Med. 2021 Oct;137:104807. doi: 10.1016/j.compbiomed.2021.104807. Epub 2021 Aug 27.

Cited by

1. Anterograde interference in multitask perceptual learning.
NPJ Sci Learn. 2025 May 9;10(1):23. doi: 10.1038/s41539-025-00312-7.
2. Electrochemical ohmic memristors for continual learning.
Nat Commun. 2025 Mar 8;16(1):2348. doi: 10.1038/s41467-025-57543-w.
3. Hybrid neural networks for continual learning inspired by corticohippocampal circuits.
Nat Commun. 2025 Feb 2;16(1):1272. doi: 10.1038/s41467-025-56405-9.
4. Neuromorphic neuromodulation: Towards the next generation of closed-loop neurostimulation.
PNAS Nexus. 2024 Oct 30;3(11):pgae488. doi: 10.1093/pnasnexus/pgae488. eCollection 2024 Nov.
5. Eight challenges in developing theory of intelligence.
Front Comput Neurosci. 2024 Jul 24;18:1388166. doi: 10.3389/fncom.2024.1388166. eCollection 2024.
6. Bio-inspired, task-free continual learning through activity regularization.
Biol Cybern. 2023 Oct;117(4-5):345-361. doi: 10.1007/s00422-023-00973-w. Epub 2023 Aug 17.
7. Distinctive properties of biological neural networks and recent advances in bottom-up approaches toward a better biologically plausible neural network.
Front Comput Neurosci. 2023 Jun 28;17:1092185. doi: 10.3389/fncom.2023.1092185. eCollection 2023.
8. On-device synaptic memory consolidation using Fowler-Nordheim quantum-tunneling.
Front Neurosci. 2023 Jan 13;16:1050585. doi: 10.3389/fnins.2022.1050585. eCollection 2022.
9. Bayesian continual learning spiking neural networks.
Front Comput Neurosci. 2022 Nov 16;16:1037976. doi: 10.3389/fncom.2022.1037976. eCollection 2022.
10. Metaplastic and energy-efficient biocompatible graphene artificial synaptic transistors for enhanced accuracy neuromorphic computing.
Nat Commun. 2022 Jul 28;13(1):4386. doi: 10.1038/s41467-022-32078-6.

References

1. A solution to the learning dilemma for recurrent networks of spiking neurons.
Nat Commun. 2020 Jul 17;11(1):3625. doi: 10.1038/s41467-020-17236-y.
2. Big data needs a hardware revolution.
Nature. 2018 Feb;554(7691):145-146. doi: 10.1038/d41586-018-01683-1.
3. Digital Biologically Plausible Implementation of Binarized Neural Networks With Differential Hafnium Oxide Resistive Memory Arrays.
Front Neurosci. 2020 Jan 9;13:1383. doi: 10.3389/fnins.2019.01383. eCollection 2019.
4. Continual Learning Through Synaptic Intelligence.
Proc Mach Learn Res. 2017;70:3987-3995.
5. A deep learning framework for neuroscience.
Nat Neurosci. 2019 Nov;22(11):1761-1770. doi: 10.1038/s41593-019-0520-2. Epub 2019 Oct 28.
6. Vowel recognition with four coupled spin-torque nano-oscillators.
Nature. 2018 Nov;563(7730):230-234. doi: 10.1038/s41586-018-0632-y. Epub 2018 Oct 29.
7. Learning without Forgetting.
IEEE Trans Pattern Anal Mach Intell. 2018 Dec;40(12):2935-2947. doi: 10.1109/TPAMI.2017.2773081. Epub 2017 Nov 14.
8. Synaptic Plasticity and Metaplasticity of Biological Synapse Realized in a KNbO3 Memristor for Application to Artificial Synapse.
ACS Appl Mater Interfaces. 2018 Aug 1;10(30):25673-25682. doi: 10.1021/acsami.8b04550. Epub 2018 Jul 19.
9. Equivalent-accuracy accelerated neural-network training using analogue memory.
Nature. 2018 Jun;558(7708):60-67. doi: 10.1038/s41586-018-0180-5. Epub 2018 Jun 6.
10. Programmable Synaptic Metaplasticity and below Femtojoule Spiking Energy Realized in Graphene-Based Neuromorphic Memristor.
ACS Appl Mater Interfaces. 2018 Jun 20;10(24):20237-20243. doi: 10.1021/acsami.8b04685. Epub 2018 Jun 11.