Compositional memory in attractor neural networks with one-step learning.

Affiliations

Department of Computer Science, University of Maryland, College Park, MD, USA.

Department of Electrical Engineering and Computer Science, Syracuse University, Syracuse, NY, USA.

Publication Information

Neural Netw. 2021 Jun;138:78-97. doi: 10.1016/j.neunet.2021.01.031. Epub 2021 Feb 11.

DOI: 10.1016/j.neunet.2021.01.031
PMID: 33631609
Abstract

Compositionality refers to the ability of an intelligent system to construct models out of reusable parts. This is critical for the productivity and generalization of human reasoning, and is considered a necessary ingredient for human-level artificial intelligence. While traditional symbolic methods have proven effective for modeling compositionality, artificial neural networks struggle to learn systematic rules for encoding generalizable structured models. We suggest that this is due in part to short-term memory that is based on persistent maintenance of activity patterns without fast weight changes. We present a recurrent neural network that encodes structured representations as systems of contextually-gated dynamical attractors called attractor graphs. This network implements a functionally compositional working memory that is manipulated using top-down gating and fast local learning. We evaluate this approach with empirical experiments on storage and retrieval of graph-based data structures, as well as an automated hierarchical planning task. Our results demonstrate that compositional structures can be stored in and retrieved from neural working memory without persistent maintenance of multiple activity patterns. Further, memory capacity is improved by the use of a fast store-erase learning rule that permits controlled erasure and mutation of previously learned associations. We conclude that the combination of top-down gating and fast associative learning provides recurrent neural networks with a robust functional mechanism for compositional working memory.
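
To make the mechanism above concrete, here is a minimal sketch of one-step storage and controlled erasure in a Hopfield-style attractor network. It is an illustration under assumed simplifications, not the paper's architecture: the class name OneStepAttractorMemory is hypothetical, patterns are plain bipolar vectors, store and erase are bare Hebbian and anti-Hebbian outer-product updates, and the paper's top-down gating and attractor-graph structure are omitted.

import numpy as np

class OneStepAttractorMemory:
    """Hopfield-style memory with one-step store and erase (sketch only).

    Patterns are bipolar (+1/-1) vectors. Storing adds one Hebbian
    outer-product update; erasing subtracts the same update, so a single
    weight change writes or removes an attractor without retraining.
    """

    def __init__(self, n):
        self.n = n
        self.W = np.zeros((n, n))

    def store(self, p):
        # One-step learning: a single local update makes p a fixed point.
        self.W += np.outer(p, p) / self.n
        np.fill_diagonal(self.W, 0.0)

    def erase(self, p):
        # Controlled erasure: remove this association while leaving the
        # other stored patterns' contributions to W intact.
        self.W -= np.outer(p, p) / self.n
        np.fill_diagonal(self.W, 0.0)

    def retrieve(self, cue, max_steps=50):
        # Recurrent dynamics: iterate the update until the state settles
        # into an attractor (ideally the stored pattern nearest the cue).
        s = np.where(cue >= 0, 1.0, -1.0)
        for _ in range(max_steps):
            s_next = np.where(self.W @ s >= 0, 1.0, -1.0)
            if np.array_equal(s_next, s):
                break
            s = s_next
        return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mem = OneStepAttractorMemory(200)
    a = np.where(rng.standard_normal(200) >= 0, 1.0, -1.0)
    b = np.where(rng.standard_normal(200) >= 0, 1.0, -1.0)
    mem.store(a)
    mem.store(b)
    noisy = a.copy()
    noisy[:20] *= -1.0                                   # corrupt 10% of the cue
    print("overlap:", mem.retrieve(noisy) @ a / 200.0)   # ~1.0 on successful recall
    mem.erase(a)                                         # one-step targeted forgetting

Because store and erase are each a single local weight update, patterns can be written and removed in one step with no persistent activity, which is the property the abstract emphasizes; in the paper this fast learning is additionally placed under top-down gating, so a controlling context determines which associations are written, erased, or traversed.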


Similar Articles

1
Compositional memory in attractor neural networks with one-step learning.
Neural Netw. 2021 Jun;138:78-97. doi: 10.1016/j.neunet.2021.01.031. Epub 2021 Feb 11.
2
NeuroLISP: High-level symbolic programming with attractor neural networks.
Neural Netw. 2022 Feb;146:200-219. doi: 10.1016/j.neunet.2021.11.009. Epub 2021 Nov 18.
3
A programmable neural virtual machine based on a fast store-erase learning rule.
Neural Netw. 2019 Nov;119:10-30. doi: 10.1016/j.neunet.2019.07.017. Epub 2019 Jul 26.
4
Network capacity analysis for latent attractor computation.
Network. 2003 May;14(2):273-302.
5
Engineering neural systems for high-level problem solving.
Neural Netw. 2016 Jul;79:37-52. doi: 10.1016/j.neunet.2016.03.006. Epub 2016 Mar 31.
6
Gated spiking neural network using Iterative Free-Energy Optimization and rank-order coding for structure learning in memory sequences (INFERNO GATE).
Neural Netw. 2020 Jan;121:242-258. doi: 10.1016/j.neunet.2019.09.023. Epub 2019 Sep 25.
7
Flexible multitask computation in recurrent networks utilizes shared dynamical motifs.
Nat Neurosci. 2024 Jul;27(7):1349-1363. doi: 10.1038/s41593-024-01668-6. Epub 2024 Jul 9.
8
Memory dynamics in attractor networks with saliency weights.
Neural Comput. 2010 Jul;22(7):1899-926. doi: 10.1162/neco.2010.07-09-1050.
9
Flexible Working Memory Through Selective Gating and Attentional Tagging.
Neural Comput. 2021 Jan;33(1):1-40. doi: 10.1162/neco_a_01339. Epub 2020 Oct 20.
10
Persistent learning signals and working memory without continuous attractors.
ArXiv. 2023 Aug 24:arXiv:2308.12585v1.

Cited By

1
Bridging Neuroscience and AI: Environmental Enrichment as a model for forward knowledge transfer in continual learning.
ArXiv. 2025 Jan 23:arXiv:2405.07295v3.
2
Tunable Neural Encoding of a Symbolic Robotic Manipulation Algorithm.
Front Neurorobot. 2021 Dec 14;15:744031. doi: 10.3389/fnbot.2021.744031. eCollection 2021.