

Similarity-based context aware continual learning for spiking neural networks.

Author Information

Han Bing, Zhao Feifei, Li Yang, Kong Qingqun, Li Xianqi, Zeng Yi

Affiliations

Brain-inspired Cognitive Intelligence Lab, Institute of Automation, Chinese Academy of Sciences, Beijing, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China.

Brain-inspired Cognitive Intelligence Lab, Institute of Automation, Chinese Academy of Sciences, Beijing, China.

Publication Information

Neural Netw. 2025 Apr;184:107037. doi: 10.1016/j.neunet.2024.107037. Epub 2024 Dec 12.

DOI: 10.1016/j.neunet.2024.107037
PMID: 39708703
Abstract

Biological brains have the capability to adaptively coordinate relevant neuronal populations based on the task context to learn continuously changing tasks in real-world environments. However, existing spiking neural network-based continual learning algorithms treat each task equally, ignoring the guiding role of different task similarity associations for network learning, which limits knowledge utilization efficiency. Inspired by the context-dependent plasticity mechanism of the brain, we propose a Similarity-based Context Aware Spiking Neural Network (SCA-SNN) continual learning algorithm to efficiently accomplish task incremental learning and class incremental learning. Based on contextual similarity across tasks, the SCA-SNN model can adaptively reuse neurons from previous tasks that are beneficial for new tasks (the more similar, the more neurons are reused) and flexibly expand new neurons for the new task (the more similar, the fewer neurons are expanded). Selective reuse and discriminative expansion significantly improve the utilization of previous knowledge and reduce energy consumption. Extensive experimental results on CIFAR100, ImageNet generalized datasets, and FMNIST-MNIST, SVHN-CIFAR100 mixed datasets show that our SCA-SNN model achieves superior performance compared to both SNN-based and DNN-based continual learning algorithms. Additionally, our algorithm has the capability to adaptively select similar groups of neurons for related tasks, offering a promising approach to enhancing the biological interpretability of efficient continual learning.
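The allocation rule described in the abstract, reusing more previously trained neurons when the new task is similar and expanding fewer new neurons, can be illustrated with a minimal Python sketch. This is not the authors' implementation: the function name, the linear scaling, and the assumption of a similarity score in [0, 1] are illustrative only.

def plan_neuron_allocation(task_similarity, prev_neurons, base_new_neurons):
    # task_similarity: assumed score in [0, 1] between the new task and prior tasks.
    # Reuse grows with similarity; expansion shrinks with similarity, mirroring the
    # rule stated in the abstract. Linear scaling is an illustrative choice only.
    n_reused = int(round(task_similarity * prev_neurons))
    n_expanded = int(round((1.0 - task_similarity) * base_new_neurons))
    return n_reused, n_expanded

# Example: a new task that is fairly similar (similarity 0.8) to earlier tasks.
reused, expanded = plan_neuron_allocation(0.8, prev_neurons=512, base_new_neurons=256)
print(reused, expanded)  # 410 51 -> most old neurons reused, few new ones added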


Similar Articles

1. Similarity-based context aware continual learning for spiking neural networks.
   Neural Netw. 2025 Apr;184:107037. doi: 10.1016/j.neunet.2024.107037. Epub 2024 Dec 12.
2. Adaptive Synaptic Scaling in Spiking Networks for Continual Learning and Enhanced Robustness.
   IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):5151-5165. doi: 10.1109/TNNLS.2024.3373599. Epub 2025 Feb 28.
3. CDNA-SNN: A New Spiking Neural Network for Pattern Classification Using Neuronal Assemblies.
   IEEE Trans Neural Netw Learn Syst. 2025 Feb;36(2):2274-2287. doi: 10.1109/TNNLS.2024.3353571. Epub 2025 Feb 6.
4. A review of learning in biologically plausible spiking neural networks.
   Neural Netw. 2020 Feb;122:253-272. doi: 10.1016/j.neunet.2019.09.036. Epub 2019 Oct 11.
5. RRAM-Based Spiking Neural Network With Target-Modulated Spike-Timing-Dependent Plasticity.
   IEEE Trans Biomed Circuits Syst. 2025 Apr;19(2):385-392. doi: 10.1109/TBCAS.2024.3446177. Epub 2025 Apr 2.
6. SNN-BERT: Training-efficient Spiking Neural Networks for energy-efficient BERT.
   Neural Netw. 2024 Dec;180:106630. doi: 10.1016/j.neunet.2024.106630. Epub 2024 Aug 20.
7. Multi-scale full spike pattern for semantic segmentation.
   Neural Netw. 2024 Aug;176:106330. doi: 10.1016/j.neunet.2024.106330. Epub 2024 Apr 20.
8. Hybrid neural networks for continual learning inspired by corticohippocampal circuits.
   Nat Commun. 2025 Feb 2;16(1):1272. doi: 10.1038/s41467-025-56405-9.
9. Multi-compartment neuron and population encoding powered spiking neural network for deep distributional reinforcement learning.
   Neural Netw. 2025 Feb;182:106898. doi: 10.1016/j.neunet.2024.106898. Epub 2024 Nov 17.
10. Triple-Memory Networks: A Brain-Inspired Method for Continual Learning.
    IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1925-1934. doi: 10.1109/TNNLS.2021.3111019. Epub 2022 May 2.

Cited By

1. Continual familiarity decoding from recurrent connections in spiking networks.
   PLoS Comput Biol. 2025 Aug 1;21(8):e1013304. doi: 10.1371/journal.pcbi.1013304. eCollection 2025 Aug.