IEEE Trans Cybern. 2019 Feb;49(2):688-697. doi: 10.1109/TCYB.2017.2786719. Epub 2018 Jan 8.
Generalized eigenvalue proximal support vector machines (GEPSVMs) are a simple and effective binary classification method in which each hyperplane is closest to one of the two classes and as far as possible from the other. They solve a pair of generalized eigenvalue problems to obtain two nonparallel hyperplanes. Multiview learning considers learning with multiple feature sets to improve learning performance. In this paper, we propose multiview GEPSVMs (MvGSVMs), which effectively combine two views by introducing a multiview co-regularization term that maximizes the consensus between distinct views, and skillfully transform a complicated optimization problem into a simple generalized eigenvalue problem. We also propose multiview improved GEPSVMs (MvIGSVMs), which replace the ratio in MvGSVMs with a difference to measure the gap between the distances from the two classes to the hyperplane, leading to a simpler standard eigenvalue problem. Linear MvGSVMs and MvIGSVMs are generalized to the nonlinear case by the kernel trick. Experimental results on multiple data sets show the effectiveness of the proposed approaches.
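To make the two underlying single-view formulations concrete, here is a minimal NumPy/SciPy sketch of the classical (single-view) GEPSVM and its "improved" variant, which the paper extends to the multiview setting. The function names, the Tikhonov regularization parameter `delta`, and the trade-off parameter `c` are illustrative choices, not the paper's notation: GEPSVM minimizes the ratio of squared distances via a generalized eigenvalue problem, while the improved variant minimizes their difference via a standard eigenvalue problem.

```python
import numpy as np
from scipy.linalg import eig, eigh

def gepsvm_plane(A, B, delta=1e-4):
    """Hyperplane w.x + b = 0 closest to rows of A and far from rows of B.

    Minimizes ||[A 1] z||^2 / ||[B 1] z||^2 over z = [w; b], which reduces
    to the generalized eigenvalue problem G z = lam * H z (smallest lam).
    """
    Ga = np.hstack([A, np.ones((len(A), 1))])    # augment with bias column
    Hb = np.hstack([B, np.ones((len(B), 1))])
    G = Ga.T @ Ga + delta * np.eye(Ga.shape[1])  # Tikhonov regularization
    H = Hb.T @ Hb + delta * np.eye(Hb.shape[1])
    vals, vecs = eig(G, H)
    z = np.real(vecs[:, np.argmin(np.real(vals))])
    return z[:-1], z[-1]                         # w, b

def igepsvm_plane(A, B, c=1.0, delta=1e-4):
    """Improved variant: minimize ||[A 1] z||^2 - c * ||[B 1] z||^2
    subject to ||z|| = 1, i.e. the smallest eigenvector of G - c*H
    (a standard symmetric eigenvalue problem)."""
    Ga = np.hstack([A, np.ones((len(A), 1))])
    Hb = np.hstack([B, np.ones((len(B), 1))])
    M = Ga.T @ Ga - c * (Hb.T @ Hb) + delta * np.eye(Ga.shape[1])
    vals, vecs = eigh(M)                         # eigenvalues ascending
    z = vecs[:, 0]
    return z[:-1], z[-1]
```

A new point x would be assigned to the class whose hyperplane is nearer, i.e. the k minimizing |w_k . x + b_k| / ||w_k||. The multiview methods in the paper additionally couple one such problem per view through the co-regularization term; that coupling is not shown here.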