A hierarchical ART network for the stable incremental learning of topological structures and associations from noisy data.

Affiliation

Bielefeld University, Applied Informatics, Universitätsstraße 25, 33615 Bielefeld, Germany.

Publication

Neural Netw. 2011 Oct;24(8):906-16. doi: 10.1016/j.neunet.2011.05.009. Epub 2011 Jun 7.

DOI: 10.1016/j.neunet.2011.05.009
PMID: 21704496
Abstract

In this article, a novel unsupervised neural network combining elements from Adaptive Resonance Theory and topology-learning neural networks is presented. It enables stable on-line clustering of stationary and non-stationary input data by learning their inherent topology. Here, two network components representing two different levels of detail are trained simultaneously. By virtue of several filtering mechanisms, the sensitivity to noise is diminished, which renders the proposed network suitable for the application to real-world problems. Furthermore, we demonstrate that this network constitutes an excellent basis to learn and recall associations between real-world associative keys. Its incremental nature ensures that the capacity of the corresponding associative memory fits the amount of knowledge to be learnt. Moreover, the formed clusters efficiently represent the relations between the keys, even if noisy data is used for training. In addition, we present an iterative recall mechanism to retrieve stored information based on one of the associative keys used for training. As different levels of detail are learnt, the recall can be performed with different degrees of accuracy.
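The core mechanism the abstract relies on is ART-style vigilance matching: each input either resonates with an existing category prototype (and refines it) or, on mismatch, spawns a new category, which is what makes learning both stable and incremental. The following is a minimal illustrative sketch of that vigilance-gated loop only; it is not the paper's hierarchical two-level network, and the `art_cluster` function, its match score, and its parameters are simplified assumptions for exposition.

```python
import numpy as np

def art_cluster(samples, vigilance=0.75, lr=0.5):
    """Minimal ART-style incremental clustering (illustrative sketch).

    Each sample either resonates with its best-matching prototype
    (match score >= vigilance) and updates it, or creates a new
    category, so the number of categories grows with the data.
    """
    prototypes = []  # one weight vector per learnt category
    labels = []      # category index assigned to each sample
    for x in samples:
        # find the prototype with the highest match score
        best, best_score = None, -1.0
        for i, w in enumerate(prototypes):
            # overlap of input and prototype, relative to the input norm
            score = np.minimum(x, w).sum() / (x.sum() + 1e-9)
            if score > best_score:
                best, best_score = i, score
        if best is not None and best_score >= vigilance:
            # resonance: move the winning prototype toward the input
            prototypes[best] = lr * np.minimum(x, prototypes[best]) \
                + (1 - lr) * prototypes[best]
            labels.append(best)
        else:
            # mismatch: commit a new category (incremental growth)
            prototypes.append(x.copy())
            labels.append(len(prototypes) - 1)
    return prototypes, labels
```

Because resonance only shrinks a prototype toward the intersection with matching inputs, committed categories are never overwritten by dissimilar data, which is the stability property the abstract refers to; the paper's additional filtering mechanisms and second level of detail are omitted here.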


Similar articles

1. A hierarchical ART network for the stable incremental learning of topological structures and associations from noisy data.
   Neural Netw. 2011 Oct;24(8):906-16. doi: 10.1016/j.neunet.2011.05.009. Epub 2011 Jun 7.
2. An incremental network for on-line unsupervised classification and topology learning.
   Neural Netw. 2006 Jan;19(1):90-106. doi: 10.1016/j.neunet.2005.04.006. Epub 2005 Sep 8.
3. An enhanced self-organizing incremental neural network for online unsupervised learning.
   Neural Netw. 2007 Oct;20(8):893-903. doi: 10.1016/j.neunet.2007.07.008. Epub 2007 Aug 14.
4. Boosted ARTMAP: modifications to fuzzy ARTMAP motivated by boosting theory.
   Neural Netw. 2006 May;19(4):446-68. doi: 10.1016/j.neunet.2005.08.013. Epub 2005 Dec 15.
5. GFAM: evolving Fuzzy ARTMAP neural networks.
   Neural Netw. 2007 Oct;20(8):874-92. doi: 10.1016/j.neunet.2007.05.006. Epub 2007 Jun 3.
6. Clustering: a neural network approach.
   Neural Netw. 2010 Jan;23(1):89-107. doi: 10.1016/j.neunet.2009.08.007. Epub 2009 Aug 29.
7. Meta-learning approach to neural network optimization.
   Neural Netw. 2010 May;23(4):568-82. doi: 10.1016/j.neunet.2010.02.003. Epub 2010 Feb 20.
8. Adaptive categorization of ART networks in robot behavior learning using game-theoretic formulation.
   Neural Netw. 2003 Dec;16(10):1403-20. doi: 10.1016/S0893-6080(03)00080-7.
9. A new bidirectional heteroassociative memory encompassing correlational, competitive and topological properties.
   Neural Netw. 2009 Jul-Aug;22(5-6):568-78. doi: 10.1016/j.neunet.2009.06.011. Epub 2009 Jun 30.
10. An incremental neural network with a reduced architecture.
    Neural Netw. 2012 Nov;35:70-81. doi: 10.1016/j.neunet.2012.08.003. Epub 2012 Aug 23.