College of Mathematics and Computer Science, Fuzhou University, Fuzhou 350108, China; Fujian Provincial Key Laboratory of Network Computing and Intelligent Information Processing (Fuzhou University), Fuzhou 350108, China.
Neural Netw. 2014 Sep;57:51-62. doi: 10.1016/j.neunet.2014.05.014. Epub 2014 Jun 2.
For practical pattern classification tasks solved by kernel methods, most of the computing time is spent on kernel learning (or training). However, current kernel learning approaches rely on local optimization techniques and tend to have poor time performance, especially on large datasets, so the existing algorithms do not extend easily to large-scale tasks. In this paper, we present a fast Gaussian kernel learning method that works by solving a specially structured global optimization (SSGO) problem. We optimize the Gaussian kernel function using a kernel target alignment criterion formulated as a difference of increasing (d.i.) functions. Through a power-transformation based convexification method with a fixed power-transformation parameter, this criterion can be rewritten as a difference of convex (d.c.) functions, and the resulting programming problem can then be converted into an SSGO problem: globally minimizing a concave function over a convex set. This class of problem is classical and well solvable. To find the global optimum efficiently, we adopt an improved version of Hoffman's outer approximation method, which does not need to restart the search from different starting points to locate the best local minimum, and the proposed method can be proven to converge to the global solution for any classification task. We evaluate the method on twenty benchmark datasets and compare it with four other Gaussian kernel learning methods. Experimental results show that it stably achieves both good time efficiency and good classification performance.
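To make the objective concrete, here is a minimal NumPy sketch of the kernel target alignment criterion for a Gaussian kernel, in the standard form A(K, yy^T) = <K, yy^T>_F / (||K||_F ||yy^T||_F) with labels in {-1, +1}; the function names and the grid-search loop are illustrative assumptions, not the authors' SSGO algorithm, which replaces such local or grid searches with convexification plus outer approximation.

```python
import numpy as np

def gaussian_kernel(X, gamma):
    """Gaussian (RBF) kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * X @ X.T, 0.0)
    return np.exp(-gamma * d2)

def kernel_target_alignment(K, y):
    """Alignment A(K, yy^T) with y in {-1, +1}^n.

    Since <K, yy^T>_F = y^T K y and ||yy^T||_F = n, the ratio simplifies
    to y^T K y / (n * ||K||_F).
    """
    return (y @ K @ y) / (len(y) * np.linalg.norm(K, "fro"))

# Naive reference point only: score a few kernel widths by their alignment.
# The paper's contribution is to maximize this criterion globally instead.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.where(X[:, 0] + 0.3 * rng.normal(size=200) > 0, 1.0, -1.0)
    for gamma in [0.01, 0.1, 1.0, 10.0]:
        K = gaussian_kernel(X, gamma)
        print(f"gamma={gamma:<5} alignment={kernel_target_alignment(K, y):.4f}")
```

A higher alignment indicates that the kernel-induced similarity structure better matches the label structure, which is why it serves as the training objective for selecting the Gaussian kernel parameters.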