Knebel Tilman, Hochreiter Sepp, Obermayer Klaus
Neural Information Processing Group, Fakultät IV, Technische Universität Berlin, 10587 Berlin, Germany.
Neural Comput. 2008 Jan;20(1):271-87. doi: 10.1162/neco.2008.20.1.271.
We describe a fast sequential minimal optimization (SMO) procedure for solving the dual optimization problem of the recently proposed potential support vector machine (P-SVM). The new SMO consists of a sequence of iteration steps in which the Lagrangian is optimized with respect to either one (single SMO) or two (dual SMO) of the Lagrange multipliers while keeping the other variables fixed. An efficient selection procedure for the Lagrange multipliers is given, and two heuristics for improving the SMO procedure are described: block optimization and annealing of the regularization parameter epsilon. A comparison of the variants shows that the dual SMO, including block optimization and annealing, performs efficiently in terms of computation time. In contrast to standard support vector machines (SVMs), the P-SVM is applicable to arbitrary dyadic data sets; nevertheless, benchmarks against libSVM's epsilon-SVR and C-SVC implementations are provided for problems that are also solvable by standard SVM methods. For those problems, the computation time of the P-SVM is comparable to, or somewhat higher than, that of the standard SVM, while the number of support vectors found by the P-SVM is usually much smaller for the same generalization performance.
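To make the "single SMO" idea concrete, the sketch below performs coordinate-wise updates on a generic box-constrained quadratic dual without an equality constraint, which is what allows a single multiplier to be optimized at a time while the others stay fixed. This is a minimal illustration under assumed notation (Q, q, C, the function name single_smo, and the greedy KKT-violation selection are all placeholders chosen here); it does not reproduce the actual P-SVM dual, the two-variable dual SMO, the block optimization, or the epsilon-annealing heuristic described in the paper.

```python
import numpy as np


def single_smo(Q, q, C, tol=1e-6, max_iter=10000):
    """Coordinate-wise ("single SMO"-style) solver for a toy quadratic dual:

        min_a  0.5 * a^T Q a - q^T a    subject to  0 <= a_i <= C

    Generic illustration only; the exact P-SVM dual and the selection
    heuristics of Knebel et al. are not reproduced here.
    """
    n = len(q)
    a = np.zeros(n)
    grad = -q.copy()                      # gradient Q a - q at a = 0
    for _ in range(max_iter):
        # KKT violation per coordinate: how far a_i could still move downhill
        # without leaving the box [0, C]
        viol = np.maximum(np.where(a < C, -grad, 0.0),
                          np.where(a > 0.0, grad, 0.0))
        i = int(np.argmax(viol))
        if viol[i] < tol:                 # all multipliers satisfy KKT to tolerance
            break
        # exact minimizer along coordinate i, clipped back into the box
        step = -grad[i] / max(Q[i, i], 1e-12)
        a_new = float(np.clip(a[i] + step, 0.0, C))
        delta = a_new - a[i]
        a[i] = a_new
        grad += delta * Q[:, i]           # rank-one update of the gradient
    return a


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((20, 5))
    Q = X @ X.T + 1e-3 * np.eye(20)       # random positive-definite matrix
    q = rng.standard_normal(20)
    a = single_smo(Q, q, C=1.0)
    print("nonzero multipliers:", int(np.count_nonzero(a)))
```

Each iteration selects the multiplier with the largest KKT violation, solves the one-dimensional subproblem in closed form, and updates the gradient with a rank-one correction, which is the general pattern a single-multiplier SMO follows; the dual SMO variant in the paper instead updates two multipliers jointly per step.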