Zheng Yunfei, Chen Badong, Wang Shiyuan, Wang Weiqun, Qin Wei
IEEE Trans Neural Netw Learn Syst. 2022 Feb;33(2):811-825. doi: 10.1109/TNNLS.2020.3029198. Epub 2022 Feb 3.
Kernel-based extreme learning machine (KELM), a natural extension of ELM to kernel learning, has achieved outstanding performance on a wide range of regression and classification problems. Compared with the basic ELM, KELM generalizes better because it requires neither the number of hidden nodes to be specified in advance nor a random projection mechanism. However, since KELM is derived under the minimum mean square error (MMSE) criterion, which assumes Gaussian noise, its performance may deteriorate seriously in non-Gaussian cases. To improve the robustness of KELM, this article proposes a mixture correntropy-based KELM (MC-KELM), which replaces the MMSE criterion with the recently proposed maximum mixture correntropy criterion as the optimization objective. In addition, an online sequential version of MC-KELM (MCOS-KELM) is developed to handle data that arrive sequentially (one-by-one or chunk-by-chunk). Experimental results on regression and classification data sets are reported to validate the performance advantages of the new methods.
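To make the contrast concrete, the following is a minimal NumPy sketch (not the authors' implementation) of standard KELM under the MMSE criterion, alongside a hypothetical iteratively reweighted variant in the spirit of MC-KELM: each sample is reweighted by the derivative of the mixture correntropy of its residual (a mixture of two Gaussian kernels with bandwidths `sigmas` and mixing weights `alphas`, all illustrative parameter names), so large non-Gaussian residuals receive small weight. The exact fixed-point update used in the paper may differ.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_fit(X, t, C=1.0, gamma=1.0):
    """Standard KELM under MMSE: beta = (K + I/C)^{-1} t."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + np.eye(len(X)) / C, t)

def mc_kelm_fit(X, t, C=1.0, gamma=1.0,
                sigmas=(0.5, 2.0), alphas=(0.5, 0.5), iters=10):
    """Illustrative mixture-correntropy KELM via iterative reweighting.

    Minimizes (approximately) a weighted ridge objective where each
    sample weight w_i comes from the mixture-correntropy influence of
    its current residual e_i; samples with large residuals (outliers)
    are down-weighted. Stationarity of the weighted objective gives
    beta = (W K + I/C)^{-1} W t, iterated to a fixed point.
    """
    K = rbf_kernel(X, X, gamma)
    n = len(X)
    beta = np.linalg.solve(K + np.eye(n) / C, t)  # MMSE warm start
    for _ in range(iters):
        e = t - K @ beta
        # Mixture-of-Gaussians correntropy weights (hypothetical form)
        w = sum(a / s**2 * np.exp(-e**2 / (2 * s**2))
                for a, s in zip(alphas, sigmas))
        W = np.diag(w)
        beta = np.linalg.solve(W @ K + np.eye(n) / C, W @ t)
    return beta

def kelm_predict(X_train, X_new, beta, gamma=1.0):
    """Predict with f(x) = k(x, X_train) @ beta."""
    return rbf_kernel(X_new, X_train, gamma) @ beta
```

On clean Gaussian-noise data both fits behave similarly, since near-constant weights merely rescale the system; the difference appears when heavy-tailed or impulsive noise inflates a few residuals, which the mixture-correntropy weights suppress.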