School of Information, Renmin University of China, Beijing 100872, China.
School of Computer Science and Technology, Huaiyin Normal University, Huai'an, Jiangsu 223300, China.
Comput Intell Neurosci. 2018 Jan 23;2018:1018789. doi: 10.1155/2018/1018789. eCollection 2018.
By utilizing kernel functions, support vector machines (SVMs) successfully solve linearly inseparable problems, which has greatly extended their applicable areas. Using multiple kernels (MKs) to improve SVM classification accuracy has been a hot topic in the SVM research community for several years. However, most MK learning (MKL) methods employ an ℓ1-norm constraint on the kernel combination weights, which yields a sparse yet nonsmooth solution for the kernel weights. Alternatively, an ℓ2-norm constraint on the kernel weights keeps all information in the base kernels, but the resulting solution is nonsparse and sensitive to noise. Recently, some scholars presented an efficient sparse generalized MKL (ℓ1- and ℓ2-norms based GMKL) method, which establishes an elastic constraint on the kernel weights. In this paper, we further extend GMKL to a more generalized MKL method based on the ℓp-norm, by joining the ℓ1- and ℓp-norms; consequently, the ℓ1- and ℓ2-norms based GMKL is a special case of our method when p = 2. Experiments demonstrate that our ℓ1- and ℓp-norms based MKL offers higher classification accuracy than the ℓ1- and ℓ2-norms based GMKL, while keeping the properties of the ℓ1- and ℓ2-norms based GMKL.
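For concreteness, a minimal sketch of the setup the abstract refers to: in MKL the combined kernel is a nonnegative weighted sum of base kernels, and an elastic constraint couples an ℓ1 term with an ℓ2 (here, ℓp) term on the weight vector. The exact form of the constraint and the trade-off parameter σ below are assumptions for illustration, not taken from the abstract.

```latex
% Minimal sketch (assumed forms, not quoted from the abstract).
% Combined kernel: a nonnegative weighted sum of M base kernels.
\[
  K(x, x') \;=\; \sum_{m=1}^{M} \mu_m \, K_m(x, x'),
  \qquad \mu_m \ge 0 .
\]
% Assumed elastic constraint of the l1- and l2-norms based GMKL,
% with trade-off parameter sigma > 0:
\[
  \|\mu\|_1 \;+\; \sigma \,\|\mu\|_2^2 \;\le\; 1 .
\]
% Its lp generalization considered in this paper, which recovers
% the GMKL constraint as the special case p = 2:
\[
  \|\mu\|_1 \;+\; \sigma \,\|\mu\|_p^p \;\le\; 1 ,
  \qquad p > 1 .
\]
```

Under this reading, the ℓ1 term drives sparsity in the kernel weights while the ℓp term smooths the solution and retains information from all base kernels, and setting p = 2 recovers the ℓ1- and ℓ2-norms based GMKL.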