Chen Kai, Li Rongchun, Dou Yong, Liang Zhengfa, Lv Qi
National Laboratory for Parallel and Distributed Processing, National University of Defense Technology, Changsha, China.
College of Computer, National University of Defense Technology, Changsha, China.
Comput Intell Neurosci. 2017;2017:4629534. doi: 10.1155/2017/4629534. Epub 2017 Feb 13.
Learning-to-rank algorithms have become important in recent years owing to their successful application in information retrieval, recommender systems, computational biology, and other fields. The ranking support vector machine (RankSVM) is one of the state-of-the-art ranking models and is widely used. Nonlinear RankSVM (RankSVM with nonlinear kernels) can achieve higher accuracy than linear RankSVM (RankSVM with a linear kernel) on complex nonlinear ranking problems. However, training nonlinear RankSVM remains time-consuming because of the cost of computing the kernel matrix. In this paper, we propose a fast ranking algorithm based on kernel approximation that avoids computing the kernel matrix. We explore two kernel approximation methods, namely, the Nyström method and random Fourier features. After the nonlinear kernel approximation, a primal truncated Newton method is used to optimize the pairwise L2-loss (squared hinge loss) objective of the ranking model. Experimental results demonstrate that the proposed method trains much faster than kernel RankSVM while achieving comparable or better performance than state-of-the-art ranking algorithms.
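To make the approach concrete, here is a minimal sketch (not the authors' code) of the random-Fourier-features variant: the input is mapped into an explicit randomized feature space that approximates the RBF kernel, and a linear ranking model is then fit on pairwise differences with the squared hinge (L2) loss. For brevity this sketch uses plain gradient descent rather than the primal truncated Newton method of the paper; all function names and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_map(X, n_components=200, gamma=0.5, rng=rng):
    """Random Fourier features approximating the RBF kernel
    k(x, y) = exp(-gamma * ||x - y||^2) (Rahimi & Recht).
    Returns the feature matrix and the sampled (W, b) for reuse on test data."""
    d = X.shape[1]
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_components))
    b = rng.uniform(0, 2 * np.pi, size=n_components)
    Z = np.sqrt(2.0 / n_components) * np.cos(X @ W + b)
    return Z, (W, b)

def fit_pairwise_l2(Z, y, lam=1e-3, lr=0.1, iters=500):
    """Minimize (lam/2)||w||^2 + (1/P) * sum over preference pairs (i, j)
    of max(0, 1 - w·(z_i - z_j))^2 by gradient descent.
    (The paper optimizes this pairwise L2-loss with truncated Newton instead.)"""
    # Build all preference pairs (i, j) with y_i > y_j.
    idx_i, idx_j = np.where(y[:, None] > y[None, :])
    D = Z[idx_i] - Z[idx_j]            # one row per pair: z_i - z_j
    P = max(len(D), 1)
    w = np.zeros(Z.shape[1])
    for _ in range(iters):
        margins = 1.0 - D @ w          # squared hinge is active where margin > 0
        active = margins > 0
        grad = lam * w - 2.0 * (D[active].T @ margins[active]) / P
        w -= lr * grad
    return w
```

A ranking score for a new example x is then `w · phi(x)`, where `phi` reuses the sampled `(W, b)`; because the kernel is approximated by an explicit finite-dimensional map, no kernel matrix is ever formed.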