Universidade Federal do Espírito Santo - UFES, Vitória - ES, Brazil.
Neural Netw. 2012 Nov;35:70-81. doi: 10.1016/j.neunet.2012.08.003. Epub 2012 Aug 23.
This paper proposes a technique called Evolving Probabilistic Neural Network (ePNN) that offers several attractive features: incremental learning, an evolving architecture, the capacity to learn continually throughout its existence, and the property that each training sample is used only once during training, without reprocessing. A series of experiments was performed on public-domain data sets; the results indicate that ePNN is superior or equal to the other incremental neural networks evaluated in this paper. These results also demonstrate the advantage of ePNN's small architecture and show that its architecture is more stable than those of the other incremental neural networks evaluated. ePNN thus appears to be a promising choice for a quick-learning system and a fast classifier with low computational cost.
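To make the core idea concrete, the following is a minimal sketch of a classic probabilistic neural network with one-pass incremental training, the family of models ePNN builds on. It is an illustration only, not the paper's ePNN: the class name, the `sigma` kernel-width parameter, and the store-every-sample policy are assumptions for the sketch, whereas ePNN additionally evolves its architecture to keep the number of neurons small.

```python
import numpy as np

class IncrementalPNN:
    """Illustrative one-pass probabilistic neural network (not the paper's ePNN).

    Each training sample is seen exactly once and stored as a Gaussian
    pattern neuron; classification picks the class with the largest
    Parzen-window density estimate at the query point.
    """

    def __init__(self, sigma=0.5):
        self.sigma = sigma   # kernel width (assumed hyperparameter)
        self.patterns = {}   # class label -> list of stored samples

    def partial_fit(self, x, y):
        # Incremental training: store the sample once, no reprocessing.
        self.patterns.setdefault(y, []).append(np.asarray(x, dtype=float))

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        best_label, best_density = None, -1.0
        for label, samples in self.patterns.items():
            diff = np.stack(samples) - x
            # Mean of Gaussian kernels centered on the stored samples.
            density = np.exp(-np.sum(diff * diff, axis=1)
                             / (2.0 * self.sigma ** 2)).mean()
            if density > best_density:
                best_label, best_density = label, density
        return best_label
```

For example, after calling `partial_fit` once per sample on a small two-class set, `predict` returns the label whose stored neurons lie closest (in kernel-density terms) to the query. A plain PNN like this grows one neuron per sample; the paper's contribution is precisely an evolving architecture that avoids this growth while keeping the one-pass training property.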