Zheng Yunfei, Chen Badong, Wang Shiyuan, Wang Weiqun
IEEE Trans Neural Netw Learn Syst. 2021 Jul;32(7):3083-3097. doi: 10.1109/TNNLS.2020.3009417. Epub 2021 Jul 6.
As an effective and efficient discriminative learning method, the broad learning system (BLS) has received increasing attention due to its outstanding performance in various regression and classification problems. However, the standard BLS is derived under the minimum mean square error (MMSE) criterion, which is not always a good choice because of its sensitivity to outliers. To enhance the robustness of BLS, we propose in this work to adopt the maximum correntropy criterion (MCC) to train the output weights, obtaining a correntropy-based BLS (C-BLS). Owing to the inherent advantages of MCC, the proposed C-BLS is expected to achieve excellent robustness to outliers while maintaining the performance of the standard BLS in Gaussian or noise-free environments. In addition, three alternative incremental learning algorithms for C-BLS are developed, derived from a weighted regularized least-squares solution rather than the pseudoinverse formula. With these incremental learning algorithms, the system can be updated quickly when new samples arrive or the network needs to be expanded, without retraining from scratch. Experiments on various regression and classification data sets demonstrate the desirable performance of the new methods.
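The core idea of MCC-based output-weight training can be illustrated with a small sketch. A standard way to optimize a correntropy objective is a fixed-point (half-quadratic) iteration that repeatedly solves a weighted regularized least-squares problem, where each sample's weight is a Gaussian kernel of its current residual, so outliers are automatically downweighted. The function below is a hypothetical illustration of that general technique, not the authors' exact algorithm; the names `correntropy_weights`, `sigma`, and `lam` are assumptions for this sketch.

```python
import numpy as np

def correntropy_weights(A, Y, sigma=1.0, lam=1e-3, iters=20):
    """Fixed-point (half-quadratic) solver for correntropy-based
    regularized least squares. Hypothetical sketch, not the paper's
    exact C-BLS algorithm.

    A : (n, d) expanded feature matrix (mapped + enhancement nodes)
    Y : (n, m) target matrix
    """
    n, d = A.shape
    # Initialize from the standard MMSE / ridge solution.
    W = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ Y)
    for _ in range(iters):
        E = Y - A @ W
        # Per-sample Gaussian-kernel weights: large residuals
        # (outliers) receive weights near zero.
        w_i = np.exp(-np.sum(E**2, axis=1) / (2 * sigma**2))
        AtW = A.T * w_i  # broadcast sample weights over columns
        # Weighted regularized least-squares update.
        W = np.linalg.solve(AtW @ A + lam * np.eye(d), AtW @ Y)
    return W
```

On data contaminated by a few large outliers, this iteration typically recovers weights much closer to the clean solution than the plain MMSE fit, which is the robustness property the abstract attributes to C-BLS.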