
Toward optimizing a self-creating neural network.

Author Information

Wang J H, Rau J D, Peng C Y

Affiliation

Dept. of Electr. Eng., Nat. Taiwan Ocean Univ., Keelung.

Publication Information

IEEE Trans Syst Man Cybern B Cybern. 2000;30(4):586-93. doi: 10.1109/3477.865177.

Abstract

This paper optimizes the performance of the growing cell structures (GCS) model in learning topology and vector quantization. Each node in GCS has a resource counter attached. During the competitive learning process, the counter of the best-matching node is increased by a defined resource measure after each input presentation, and then all resource counters are decayed by a factor alpha. We show that the summation of all resource counters is conserved. This conservation principle provides useful clues for exploring important characteristics of GCS, which in turn offer insight into how GCS can be optimized. In the context of information entropy, we show that the performance of GCS in learning topology and vector quantization can be optimized by using alpha=0 combined with a threshold-free node-removal scheme, regardless of whether the input data are stationary or nonstationary. The meaning of optimization is twofold: (1) for learning topology, the information entropy is maximized in terms of the equiprobable criterion, and (2) for learning vector quantization, the quantization error is minimized in terms of the equi-error criterion.
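The counter dynamics described in the abstract (winner's counter incremented by a resource measure, then all counters decayed by alpha) can be sketched in a few lines. The simulation below is a minimal illustration only, not the authors' implementation: the node count, decay factor alpha = 0.05, unit resource measure, and random winner selection are all assumptions made for demonstration. Under this update rule the total count S obeys S_{t+1} = (1 - alpha)(S_t + r) and is driven to the fixed point (1 - alpha) * r / alpha regardless of which nodes win, which is one way to read the conservation property the paper exploits.

```python
import random

def simulate_counters(num_nodes=5, alpha=0.05, resource=1.0, steps=2000, seed=0):
    """Hypothetical sketch of a GCS-style resource-counter update:
    after each input presentation, the best-matching node's counter is
    increased by `resource`, then every counter decays by factor `alpha`.
    """
    rng = random.Random(seed)
    counters = [0.0] * num_nodes
    for _ in range(steps):
        # Stand-in for the best-matching-node search on a real input.
        winner = rng.randrange(num_nodes)
        counters[winner] += resource              # credit the winner
        counters = [(1.0 - alpha) * c for c in counters]  # global decay
    return counters

counters = simulate_counters()
total = sum(counters)
# The total follows S_{t+1} = (1 - alpha) * (S_t + r), so after many
# steps it settles at the fixed point (1 - alpha) * r / alpha = 19.0
# for these illustrative parameter values.
```

Note that the steady-state total depends only on alpha and the resource measure, not on which nodes win; the individual counters then carry the distributional information used for node insertion and removal.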
