
Suppr 超能文献


Exploring the associative learning capabilities of the segmented attractor network for lifelong learning.

Authors

Jones Alexander, Jha Rashmi

Affiliation

Department of Electrical Engineering and Computer Science, University of Cincinnati, Cincinnati, OH, United States.

Publication

Front Artif Intell. 2022 Aug 1;5:910407. doi: 10.3389/frai.2022.910407. eCollection 2022.

DOI: 10.3389/frai.2022.910407
PMID: 35978653
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9376266/
Abstract

This work explores the process of adapting the segmented attractor network to a lifelong learning setting. Taking inspirations from Hopfield networks and content-addressable memory, the segmented attractor network is a powerful tool for associative memory applications. The network's performance as an associative memory is analyzed using multiple metrics. In addition to the network's general hit rate, its capability to recall unique memories and their frequency is also evaluated with respect to time. Finally, additional learning techniques are implemented to enhance the network's recall capacity in the application of lifelong learning. These learning techniques are based on human cognitive functions such as memory consolidation, prediction, and forgetting.
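The record does not specify the segmented attractor network's internals, but the Hopfield-network inspiration the abstract cites can be illustrated with a minimal sketch. The following is an assumption-laden toy example of Hebbian storage and attractor recall in a classical Hopfield network, not the paper's architecture; the "hit" check mirrors the exact-recall notion behind a hit-rate metric:

```python
import numpy as np

# Illustrative Hopfield-style associative memory (NOT the paper's segmented
# attractor network; a minimal sketch of the Hopfield idea it builds on).
rng = np.random.default_rng(0)

def store(patterns):
    """Hebbian outer-product learning over bipolar (+/-1) patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # no self-connections
    return W

def recall(W, probe, steps=20):
    """Iterate sign(W s) until the state settles into an attractor."""
    s = probe.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1.0   # break ties deterministically
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Store two random 64-unit patterns, then probe with a corrupted copy.
P = rng.choice([-1.0, 1.0], size=(2, 64))
W = store(P)
probe = P[0].copy()
probe[:8] *= -1               # flip 8 of 64 bits
out = recall(W, probe)
hit = np.array_equal(out, P[0])  # exact recall, as a hit-rate metric counts it
```

With only two stored patterns in 64 units, the corrupted probe falls back into the correct attractor; the paper's lifelong-learning setting additionally tracks how such recall degrades over time and counters it with consolidation-, prediction-, and forgetting-inspired mechanisms.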


Figures (g0001–g0013):

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/d472374ffd90/frai-05-910407-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/20564e26c7ce/frai-05-910407-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/9e917e53996d/frai-05-910407-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/3ea4b07b3598/frai-05-910407-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/af71733bebb5/frai-05-910407-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/84b10bd815d3/frai-05-910407-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/85fe03515300/frai-05-910407-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/024ec72da665/frai-05-910407-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/7201ba84d8e3/frai-05-910407-g0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/1a9a65146f34/frai-05-910407-g0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/18b32fc53a59/frai-05-910407-g0011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/1ccc1b19dbbd/frai-05-910407-g0012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0ab8/9376266/f10e5571d92f/frai-05-910407-g0013.jpg

Similar Articles

1. Exploring the associative learning capabilities of the segmented attractor network for lifelong learning.
Front Artif Intell. 2022 Aug 1;5:910407. doi: 10.3389/frai.2022.910407. eCollection 2022.

2. Network capacity analysis for latent attractor computation.
Network. 2003 May;14(2):273-302.

3. On stability and associative recall of memories in attractor neural networks.
PLoS One. 2020 Sep 17;15(9):e0238054. doi: 10.1371/journal.pone.0238054. eCollection 2020.

4. Re-encoding of associations by recurrent plasticity increases memory capacity.
Front Synaptic Neurosci. 2014 Jun 10;6:13. doi: 10.3389/fnsyn.2014.00013. eCollection 2014.

5. Delay for the capacity-simplicity dilemma in associative memory attractor networks.
Neural Netw. 2012 May;29-30:37-51. doi: 10.1016/j.neunet.2012.01.007. Epub 2012 Feb 2.

6. In Search of Dispersed Memories: Generative Diffusion Models Are Associative Memory Networks.
Entropy (Basel). 2024 Apr 29;26(5):381. doi: 10.3390/e26050381.

7. Vector Symbolic Finite State Machines in Attractor Neural Networks.
Neural Comput. 2024 Mar 21;36(4):549-595. doi: 10.1162/neco_a_01638.

8. Universal Hopfield Networks: A General Framework for Single-Shot Associative Memory Models.
Proc Mach Learn Res. 2022 Jul;162:15561-15583.

9. A Multi-functional Memristive Pavlov Associative Memory Circuit Based on Neural Mechanisms.
IEEE Trans Biomed Circuits Syst. 2021 Oct;15(5):978-993. doi: 10.1109/TBCAS.2021.3108354. Epub 2021 Dec 9.

10. A new neural network architecture with associative memory, pruning and order-sensitive learning.
Int J Neural Syst. 1999 Aug;9(4):351-70. doi: 10.1142/s0129065799000332.

References Cited in This Article

1. Bimodular continuous attractor neural networks with static and moving stimuli.
Phys Rev E. 2023 Jun;107(6-1):064302. doi: 10.1103/PhysRevE.107.064302.

2. Hippocampal Interictal Spikes during Sleep Impact Long-Term Memory Consolidation.
Ann Neurol. 2020 Jun;87(6):976-987. doi: 10.1002/ana.25744. Epub 2020 Apr 24.

3. The Evolutionary Origin of Associative Learning.
Am Nat. 2020 Jan;195(1):E1-E19. doi: 10.1086/706252. Epub 2019 Nov 26.

4. Toward Training Recurrent Neural Networks for Lifelong Learning.
Neural Comput. 2020 Jan;32(1):1-35. doi: 10.1162/neco_a_01246. Epub 2019 Nov 8.

5. The neurobiological foundation of memory retrieval.
Nat Neurosci. 2019 Oct;22(10):1576-1585. doi: 10.1038/s41593-019-0493-1. Epub 2019 Sep 24.

6. Continual lifelong learning with neural networks: A review.
Neural Netw. 2019 May;113:54-71. doi: 10.1016/j.neunet.2019.01.012. Epub 2019 Feb 6.

7. Learning rules for aversive associative memory formation.
Curr Opin Neurobiol. 2018 Apr;49:148-157. doi: 10.1016/j.conb.2018.02.010. Epub 2018 Mar 5.

8. The problem of detecting long-term forgetting: Evidence from the Crimes Test and the Four Doors Test.
Cortex. 2019 Jan;110:69-79. doi: 10.1016/j.cortex.2018.01.017. Epub 2018 Feb 2.

9. C. elegans positive olfactory associative memory is a molecularly conserved behavioral paradigm.
Neurobiol Learn Mem. 2014 Nov;115:86-94. doi: 10.1016/j.nlm.2014.07.011. Epub 2014 Aug 7.

10. The operation of pattern separation and pattern completion processes associated with different attributes or domains of memory.
Neurosci Biobehav Rev. 2013 Jan;37(1):36-58. doi: 10.1016/j.neubiorev.2012.09.014. Epub 2012 Oct 5.