Tsang Ivor Wai-hung, Kwok James Tin-yau
Department of Computer Science, The Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong.
IEEE Trans Neural Netw. 2006 Jan;17(1):48-58. doi: 10.1109/TNN.2005.860848.
The kernel function plays a central role in kernel methods. Most existing methods can only adapt the kernel parameters or the kernel matrix based on empirical data. Recently, Ong et al. introduced the method of hyperkernels, which can be used to learn the kernel function directly in an inductive setting. However, the associated optimization problem is a semidefinite program (SDP), which is computationally very expensive even with recent advances in interior point methods. In this paper, we show that this learning problem can be equivalently reformulated as a second-order cone program (SOCP), which can be solved more efficiently than an SDP. Comparison is also made with the kernel matrix learning method proposed by Lanckriet et al. Experimental results on both classification and regression problems, with toy and real-world data sets, show that our proposed SOCP formulation achieves a significant speedup over the original SDP formulation. Moreover, it yields better generalization than Lanckriet et al.'s method, at a speed comparable to, and sometimes even faster than, their quadratically constrained quadratic program (QCQP) formulation.
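To make the SDP-versus-SOCP distinction concrete, the following is a minimal, hypothetical sketch (not the paper's hyperkernel formulation) using the CVXPY modeling library. It contrasts a toy SDP, whose variable is a positive semidefinite matrix, with a toy SOCP, whose constraints are norm-cone bounds; SOCPs admit cheaper conic solvers, which is the efficiency gain the abstract refers to. All variable names and problem data here are illustrative assumptions.

```python
import cvxpy as cp
import numpy as np

# Illustrative toy problems only; not the paper's hyperkernel optimization.
n = 5
rng = np.random.default_rng(0)
A = rng.standard_normal((n, n))
A = A @ A.T  # symmetric positive semidefinite data matrix

# SDP: the decision variable X is an n-by-n PSD matrix.
# Minimizing trace(A X) subject to trace(X) = 1 recovers the
# smallest eigenvalue of A.
X = cp.Variable((n, n), PSD=True)
sdp = cp.Problem(cp.Minimize(cp.trace(A @ X)), [cp.trace(X) == 1])
sdp.solve(solver=cp.SCS)

# SOCP: the decision variable x is a vector, and all constraints are
# second-order (norm) cone constraints, which are cheaper to handle.
b = rng.standard_normal(n)
x = cp.Variable(n)
socp = cp.Problem(cp.Minimize(cp.norm(A @ x - b, 2)),
                  [cp.norm(x, 2) <= 1])
socp.solve(solver=cp.ECOS)

print("SDP optimal value:", sdp.value)
print("SOCP optimal value:", socp.value)
```

Every SOCP can be written as an SDP but not vice versa, so reformulating an SDP as an SOCP, when an equivalent one exists, moves the problem to a strictly easier conic class; that equivalence for the hyperkernel learning problem is the paper's contribution.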