
An incremental neural network with a reduced architecture.

Affiliation

Universidade Federal do Espírito Santo - UFES, Vitória - ES, Brazil.

Publication

Neural Netw. 2012 Nov;35:70-81. doi: 10.1016/j.neunet.2012.08.003. Epub 2012 Aug 23.

Abstract

This paper proposes a technique, called Evolving Probabilistic Neural Network (ePNN), with several notable features: incremental learning, an evolving architecture, the capacity to learn continually throughout its existence, and single-pass training, in which each training sample is used only once and never reprocessed. A series of experiments was performed on public-domain data sets; the results indicate that ePNN matches or outperforms the other incremental neural networks evaluated in this paper. The results also demonstrate the advantage of ePNN's compact architecture and show that its architecture is more stable than those of the other incremental neural networks evaluated. ePNN thus appears to be a promising choice for a quick learning system and a fast classifier with low computational cost.
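The abstract does not describe ePNN's internals, but the properties it lists (single-pass training, an architecture that grows only when needed, probabilistic classification) can be illustrated with a generic prototype-based incremental classifier. The sketch below is a hypothetical illustration of that general idea, not the paper's actual ePNN algorithm; the class name, the `novelty_threshold` rule, and the Gaussian scoring are all assumptions for the example.

```python
import math

class IncrementalPrototypeClassifier:
    """A generic single-pass, prototype-based incremental classifier.

    Illustrates the properties named in the abstract: each training
    sample is processed exactly once, and the architecture (the set
    of prototypes) evolves, i.e. grows only when a sample is novel.
    NOT the paper's ePNN algorithm; a simplified stand-in.
    """

    def __init__(self, sigma=1.0, novelty_threshold=2.0):
        self.sigma = sigma            # Gaussian kernel width used at prediction time
        self.tau = novelty_threshold  # distance beyond which a new prototype is created
        self.prototypes = []          # list of (center, sample_count, class_label)

    def partial_fit(self, x, y):
        """Absorb one training sample; each sample is seen only once."""
        # Find the nearest existing prototype of the same class.
        best_i, best_d = None, float("inf")
        for i, (c, n, lbl) in enumerate(self.prototypes):
            if lbl != y:
                continue
            d = math.dist(x, c)
            if d < best_d:
                best_i, best_d = i, d
        if best_i is not None and best_d <= self.tau:
            # Familiar region: fold the sample into the prototype (running mean).
            c, n, lbl = self.prototypes[best_i]
            new_c = [(ci * n + xi) / (n + 1) for ci, xi in zip(c, x)]
            self.prototypes[best_i] = (new_c, n + 1, lbl)
        else:
            # Novel region: evolve the architecture by adding a prototype.
            self.prototypes.append((list(x), 1, y))

    def predict(self, x):
        """Classify by Gaussian-weighted vote over the prototypes."""
        scores = {}
        for c, n, lbl in self.prototypes:
            d2 = sum((xi - ci) ** 2 for xi, ci in zip(x, c))
            scores[lbl] = scores.get(lbl, 0.0) + n * math.exp(-d2 / (2 * self.sigma ** 2))
        return max(scores, key=scores.get)
```

Because prototypes merge whenever a sample falls inside the novelty threshold, the architecture stays small when the data form tight clusters, which mirrors the compact, stable architecture the abstract reports for ePNN.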

