Yang Jian-Bo, Ong Chong-Jin
Department of Mechanical Engineering, National University of Singapore, Singapore.
IEEE Trans Neural Netw. 2011 Jun;22(6):954-62. doi: 10.1109/TNN.2011.2128342. Epub 2011 May 5.
This paper presents a new wrapper-based feature selection method for support vector regression (SVR) using its probabilistic predictions. The method computes the importance of a feature by aggregating the difference, over the feature space, of the conditional density functions of the SVR prediction with and without the feature. As the exact computation of this importance measure is expensive, two approximations are proposed. The effectiveness of the measure using these approximations, in comparison to several other existing feature selection methods for SVR, is evaluated on both artificial and real-world problems. The results of the experiments show that the proposed method generally performs better than, or at least as well as, the existing methods, with a notable advantage when the dataset is sparse.
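The core idea of the abstract can be illustrated with a minimal sketch: score each feature by the aggregated difference between the model's predictive density with and without that feature. This is not the paper's exact algorithm or its two approximations; as a stand-in for SVR's probabilistic predictions, the sketch assumes a Gaussian predictive density centered on the SVR point prediction with variance estimated from training residuals, and it approximates the aggregation over the feature space by averaging over the training samples and a grid of target values.

```python
# Illustrative sketch (assumed, not the paper's exact method): wrapper-style
# feature importance for SVR via differences of predictive densities.
import numpy as np
from scipy.stats import norm
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n, d = 200, 4
X = rng.normal(size=(n, d))
# Only features 0 and 1 influence the target; 2 and 3 are irrelevant.
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + 0.05 * rng.normal(size=n)

def predictive_density_params(Xtr, ytr, Xte):
    """Fit an SVR and return (mean predictions, residual std).

    The Gaussian form N(mu(x), sigma^2) is an assumption standing in for
    the probabilistic SVR predictions used in the paper.
    """
    model = SVR(kernel="rbf", C=10.0).fit(Xtr, ytr)
    mu = model.predict(Xte)
    sigma = np.std(ytr - model.predict(Xtr)) + 1e-6
    return mu, sigma

def feature_importance(X, y, n_grid=50):
    """Importance of feature j = mean absolute difference between the
    full model's predictive density and that of a model refit without j,
    averaged over samples and a grid of target values (a crude surrogate
    for the integral over the feature space described in the abstract)."""
    grid = np.linspace(y.min(), y.max(), n_grid)
    mu_full, s_full = predictive_density_params(X, y, X)
    scores = []
    for j in range(X.shape[1]):
        Xr = np.delete(X, j, axis=1)          # drop feature j, refit
        mu_j, s_j = predictive_density_params(Xr, y, Xr)
        # |p(y|x) - p(y|x \ x_j)| on an (n_samples, n_grid) lattice
        diff = np.abs(norm.pdf(grid[None, :], mu_full[:, None], s_full)
                      - norm.pdf(grid[None, :], mu_j[:, None], s_j))
        scores.append(diff.mean())
    return np.array(scores)

scores = feature_importance(X, y)
print(np.argsort(scores)[::-1])  # features ranked most to least important
```

On this synthetic problem the dominant feature (index 0) receives the largest score, since removing it shifts the predictive density the most. The refit-per-feature loop is what makes the exact measure expensive in practice, which is what motivates the paper's two approximations.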