Williams Peter, Li Sheng, Feng Jianfeng, Wu Si
IEEE Trans Neural Netw. 2007 May;18(3):942-7. doi: 10.1109/TNN.2007.891625.
The performance of a support vector machine (SVM) depends largely on the kernel function used. This letter investigates a geometrical method for optimizing the kernel function. The method is a modification of the one proposed by S. Amari and S. Wu: prior knowledge obtained in a primary training step is used to conformally rescale the kernel function so that the separation between the two classes of data is enlarged. The result is that the new algorithm works efficiently and overcomes the susceptibility of the original method.
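The abstract does not spell out the paper's exact modification, but the general Amari-Wu scheme it builds on can be sketched as follows: train a primary SVM, form a conformal factor c(x) that peaks near the support vectors (i.e., near the estimated class boundary), and retrain with the rescaled kernel K̃(x, x') = c(x)c(x')K(x, x'). The particular form of c(x), the width tau, the data, and the use of scikit-learn below are illustrative assumptions, not the authors' formulation.

```python
# Minimal sketch of an Amari-Wu style conformal kernel rescaling (assumed
# simplified form, not the paper's exact method), using scikit-learn.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=2, n_redundant=0,
                           random_state=0)
gamma = 1.0

# Primary training step: ordinary SVM with an RBF kernel.
primary = SVC(kernel="rbf", gamma=gamma).fit(X, y)
sv = primary.support_vectors_

def conformal_factor(X, sv, tau=1.0):
    """c(x) = sum_i exp(-||x - x_i||^2 / (2 tau^2)) over the support
    vectors, so c is large near the estimated class boundary."""
    d2 = ((X[:, None, :] - sv[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * tau ** 2)).sum(axis=1)

# Secondary training step: conformally rescaled kernel
# K~(x, x') = c(x) c(x') K(x, x'), supplied as a precomputed Gram matrix.
c_train = conformal_factor(X, sv)
K_train = c_train[:, None] * c_train[None, :] * rbf_kernel(X, X, gamma=gamma)
secondary = SVC(kernel="precomputed").fit(K_train, y)

# Prediction on new points uses the same rescaled cross-kernel.
X_new = X[:5]
c_new = conformal_factor(X_new, sv)
K_new = c_new[:, None] * c_train[None, :] * rbf_kernel(X_new, X, gamma=gamma)
print(secondary.predict(K_new))
```

Because c(x) is large near the support vectors, the rescaling magnifies the induced metric around the estimated boundary, which is how the separation between the two classes is enlarged.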