Bo Liefeng, Wang Ling, Jiao Licheng
Institute of Intelligent Information Processing, Xidian University, Xi'an 710071, China.
Neural Comput. 2006 Apr;18(4):961-78. doi: 10.1162/089976606775774642.
Kernel Fisher discriminant analysis (KFD) is a successful approach to classification. It is well known that the key challenge in KFD lies in the selection of free parameters such as kernel parameters and regularization parameters. Here we focus on the feature-scaling kernel, in which each feature is associated with its own scaling factor. A novel algorithm, named FS-KFD, is developed to tune the scaling factors and regularization parameters for the feature-scaling kernel. The proposed algorithm optimizes a smoothed leave-one-out error via gradient descent and has been demonstrated to be computationally feasible. FS-KFD is motivated by two fundamental facts: the leave-one-out error of KFD can be expressed in closed form, and the step function can be approximated by a sigmoid function. Empirical comparisons on artificial and benchmark data sets suggest that FS-KFD improves on KFD in terms of classification accuracy.
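The two facts motivating FS-KFD can be illustrated with a minimal sketch. The snippet below is not the paper's algorithm; it is an illustrative stand-in that uses the well-known equivalence between regularized KFD and kernel ridge regression on ±1 labels, for which the leave-one-out residuals have a closed form, and it smooths the 0/1 step in the leave-one-out error count with a sigmoid. The function names (`fs_rbf_kernel`, `smooth_loo_error`, `tune_scales`), the finite-difference gradient (the paper derives analytic gradients), and all parameter values are assumptions for illustration.

```python
import numpy as np

def fs_rbf_kernel(X1, X2, scales):
    """Feature-scaling RBF kernel: K(x, z) = exp(-sum_d s_d * (x_d - z_d)^2).

    Each feature d carries its own scaling factor s_d, so tuning `scales`
    performs a soft form of feature selection.
    """
    diff2 = (X1[:, None, :] - X2[None, :, :]) ** 2          # (n, m, d)
    return np.exp(-np.tensordot(diff2, scales, axes=([2], [0])))

def smooth_loo_error(X, y, scales, lam, beta=10.0):
    """Sigmoid-smoothed leave-one-out error (illustrative, not the paper's exact form).

    For kernel ridge regression on labels y in {-1, +1}, the LOO residual is
    r_i = (y_i - f_i) / (1 - H_ii) with H = K (K + lam*I)^(-1), so the LOO
    prediction f_loo_i = y_i - r_i needs no retraining. Point i is
    misclassified when the margin y_i * f_loo_i < 0; the step function on
    that margin is replaced by a sigmoid to make the objective differentiable.
    """
    n = len(y)
    K = fs_rbf_kernel(X, X, scales)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    f = H @ y
    r = (y - f) / (1.0 - np.diag(H))                        # closed-form LOO residuals
    margins = y * (y - r)                                   # y_i * f_loo_i
    return np.mean(1.0 / (1.0 + np.exp(beta * margins)))    # smoothed error count

def tune_scales(X, y, scales0, lam, lr=0.5, steps=20, eps=1e-4):
    """Gradient descent on log-scales (keeps scales positive).

    Uses central finite differences as a simple stand-in for the analytic
    gradient of the smoothed LOO error.
    """
    theta = np.log(np.asarray(scales0, dtype=float))
    for _ in range(steps):
        g = np.zeros_like(theta)
        for d in range(len(theta)):
            tp, tm = theta.copy(), theta.copy()
            tp[d] += eps
            tm[d] -= eps
            g[d] = (smooth_loo_error(X, y, np.exp(tp), lam)
                    - smooth_loo_error(X, y, np.exp(tm), lam)) / (2 * eps)
        theta -= lr * g
    return np.exp(theta)
```

On a toy problem where only one feature is informative, the smoothed leave-one-out error is markedly lower when that feature's scaling factor dominates, which is the signal the gradient descent exploits.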