Huilin Xiong, M. N. S. Swamy, M. Omair Ahmad
Center for Signal Processing and Communications, Department of Electrical and Computer Engineering, Concordia University, Montreal, QC H3G 1M8, Canada.
IEEE Trans Neural Netw. 2005 Mar;16(2):460-74. doi: 10.1109/TNN.2004.841784.
In this paper, we present a method of kernel optimization by maximizing a measure of class separability in the empirical feature space, a Euclidean space in which the training data are embedded in such a way that the geometrical structure of the data in the feature space is preserved. Employing a data-dependent kernel, we derive an effective kernel optimization algorithm that maximizes the class separability of the data in the empirical feature space. It is shown that there exists a close relationship between the class separability measure introduced here and the alignment measure defined recently by Cristianini. Extensive simulations are carried out, showing that the optimized kernel is more adaptive to the input data and leads to a substantial, sometimes significant, improvement in the performance of various data classification algorithms.
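To make the idea concrete, the following is a minimal sketch, not the paper's actual algorithm: it uses a conformal data-dependent kernel of the form k(x, y) = q(x) q(y) k0(x, y), where the factor q(x) = α₀ + Σᵢ αᵢ k1(x, xᵢ) is parameterized by a coefficient vector α, and a class-separability measure J = tr(S_b)/tr(S_w) (between-class over within-class scatter) computed directly from the Gram matrix. The RBF widths, the hill-climbing optimizer, and the toy data are all assumptions made for illustration; the paper derives a proper optimization algorithm for the kernel parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of a Gaussian RBF base kernel k0 (gamma is an assumed width).
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * d2)

def data_dependent_kernel(X, alpha, gamma=1.0, gamma_q=0.5):
    # Conformal data-dependent kernel: k(x, y) = q(x) q(y) k0(x, y),
    # with q(x) = alpha_0 + sum_i alpha_i * k1(x, x_i). The Gram matrix
    # diag(q) K0 diag(q) stays positive semidefinite for any q.
    K0 = rbf_kernel(X, X, gamma)
    Q = np.hstack([np.ones((len(X), 1)), rbf_kernel(X, X, gamma_q)])
    q = Q @ alpha
    return np.outer(q, q) * K0

def class_separability(K, y):
    # J = tr(S_b) / tr(S_w): between- vs within-class scatter of the data
    # as embedded in the (empirical) feature space, computed purely from K:
    #   tr(S_w) = tr(K) - sum_c B_c / n_c,
    #   tr(S_b) = sum_c B_c / n_c - (1/n) * sum_ij K_ij,
    # where B_c is the sum of the K-block belonging to class c.
    n = len(y)
    tr_w = np.trace(K)
    tr_b = -K.sum() / n
    for c in np.unique(y):
        idx = np.flatnonzero(y == c)
        block = K[np.ix_(idx, idx)].sum()
        tr_w -= block / len(idx)
        tr_b += block / len(idx)
    return tr_b / tr_w

# Toy two-class data (an assumption for the demo, not from the paper).
X = np.vstack([rng.normal(0.0, 1.0, (20, 2)),
               rng.normal(2.5, 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# Start from the base kernel (q == 1, i.e. alpha = e_0) and hill-climb on
# alpha, accepting only perturbations that increase the separability J.
alpha = np.zeros(len(X) + 1)
alpha[0] = 1.0
J_base = class_separability(data_dependent_kernel(X, alpha), y)
J = J_base
for _ in range(200):
    cand = alpha + 0.05 * rng.standard_normal(alpha.shape)
    J_cand = class_separability(data_dependent_kernel(X, cand), y)
    if J_cand > J:
        alpha, J = cand, J_cand

print(f"J(base kernel) = {J_base:.3f}, J(optimized) = {J:.3f}")
```

Because the search accepts only improving steps, the optimized kernel's separability is never worse than the base kernel's; the paper replaces this naive search with a principled maximization of the same kind of criterion.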