School of Computer Science, Qufu Normal University, Rizhao, 276826, China.
Qufu Normal University Library, Qufu Normal University, Rizhao, 276826, China.
BMC Bioinformatics. 2020 Oct 7;21(1):445. doi: 10.1186/s12859-020-03790-1.
As a machine learning method with high performance and excellent generalization ability, the extreme learning machine (ELM) is gaining popularity in a wide range of studies, and many ELM-based methods have been proposed for different fields. However, robustness to noise and outliers remains the main problem limiting the performance of ELM.
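For orientation, a minimal sketch of a standard ELM follows: the input-to-hidden weights are random and fixed, and only the output weight matrix is solved in closed form by ridge regression. This is the generic baseline model, not the authors' CSRGELM; the function and parameter names are illustrative.

```python
import numpy as np

def elm_train(X, Y, n_hidden=100, reg=1e-3, seed=0):
    """Train a basic regularized ELM.
    X: (n_samples, n_features), Y: (n_samples, n_classes) one-hot targets.
    Hidden weights W and biases b are random and never updated; the
    output weights beta solve a ridge-regression problem in closed form."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # beta = (H^T H + reg * I)^{-1} H^T Y
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    """Class scores; take argmax over columns for the predicted label."""
    return np.tanh(X @ W + b) @ beta
```

Because the hidden layer is random, training reduces to one linear solve, which is the source of ELM's speed; the least-squares loss in that solve is also why plain ELM is sensitive to noise and outliers.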
In this paper, an integrated method named correntropy induced loss based sparse robust graph regularized extreme learning machine (CSRGELM) is proposed. The correntropy induced loss improves the robustness of ELM and weakens the negative effects of noise and outliers. By constraining the output weight matrix with the L-norm, we obtain a sparse output weight matrix and thus a simpler single-hidden-layer feedforward neural network model. Introducing graph regularization to preserve the local structural information of the data further improves the classification performance of the new method. In addition, we design an iterative optimization method based on half-quadratic optimization to solve the non-convex CSRGELM problem.
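The half-quadratic idea can be illustrated with a stripped-down sketch: the bounded correntropy induced loss is handled by alternating between computing per-sample auxiliary weights (which shrink exponentially for large residuals) and re-solving a weighted ridge problem for the output weights. This is only a simplified robust-loss illustration under assumed settings, not the authors' algorithm; the full CSRGELM objective also carries the sparsity and graph-Laplacian regularization terms omitted here.

```python
import numpy as np

def correntropy_weights(E, sigma=1.0):
    """Half-quadratic auxiliary weights for the correntropy induced loss.
    E: (n_samples, n_outputs) residual matrix. A sample with a large
    residual receives an exponentially small weight, suppressing the
    influence of noise and outliers."""
    return np.exp(-np.sum(E ** 2, axis=1) / (2.0 * sigma ** 2))

def robust_elm_output_weights(H, Y, sigma=1.0, reg=1e-3, n_iter=20):
    """Iteratively reweighted solve for the ELM output weights beta under
    a correntropy-style loss (half-quadratic optimization). Alternates:
      1) fix beta, update the auxiliary sample weights;
      2) fix the weights, solve a weighted ridge problem for beta."""
    n, L = H.shape
    # plain ridge solution as the starting point
    beta = np.linalg.solve(H.T @ H + reg * np.eye(L), H.T @ Y)
    for _ in range(n_iter):
        E = H @ beta - Y
        w = correntropy_weights(E, sigma)      # step 1: auxiliary weights
        Hw = H * w[:, None]                    # row-weighted hidden outputs
        beta = np.linalg.solve(H.T @ Hw + reg * np.eye(L), Hw.T @ Y)
    return beta
```

Each subproblem is quadratic in beta, so the non-convex robust loss is reduced to a short sequence of closed-form linear solves, which is the appeal of the half-quadratic strategy.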
Classification results on benchmark datasets show that CSRGELM obtains better results than competing methods. More importantly, we also apply the new method to the classification of cancer samples and achieve good classification performance.