Research Center of Machine Learning and Data Analysis, School of Computer Science and Technology, Soochow University, Suzhou 215006, Jiangsu, China.
Neural Netw. 2013 Dec;48:32-43. doi: 10.1016/j.neunet.2013.07.005. Epub 2013 Jul 15.
This paper deals with fast methods for training a 1-norm support vector machine (SVM). First, we define a specific class of linear programs with many sparse constraints, namely row-column sparse constraint linear programming (RCSC-LP). By nature, the 1-norm SVM is a kind of RCSC-LP. To construct subproblems for RCSC-LP and solve them, a family of row-column generation (RCG) methods is introduced. RCG methods belong to the category of decomposition techniques and perform row and column generation in parallel. In particular, for the 1-norm SVM, the maximum size of the RCG subproblems equals the number of support vectors (SVs). We also introduce a semi-deleting rule for RCG methods and prove their convergence when this rule is used. Experimental results on toy data and real-world datasets show that RCG is efficient for training the 1-norm SVM, especially when the number of SVs is small.
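Concretely, the 1-norm SVM replaces the usual quadratic regularizer ‖w‖₂² with ‖w‖₁, which turns training into a linear program: splitting w = u − v with u, v ≥ 0 makes both the objective and the margin constraints linear. The sketch below is not the paper's RCG solver; it simply formulates the full 1-norm SVM LP (the RCSC-LP that RCG would decompose) and solves it directly with SciPy on hypothetical toy data:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
# Hypothetical toy data: two Gaussian clusters (not from the paper)
n, d, C = 40, 2, 1.0
X = np.vstack([rng.normal(+1.0, 0.6, (n // 2, d)),
               rng.normal(-1.0, 0.6, (n // 2, d))])
y = np.hstack([np.ones(n // 2), -np.ones(n // 2)])

# 1-norm SVM:  min ||w||_1 + C * sum(xi)
#              s.t. y_i (w.x_i + b) >= 1 - xi_i,  xi >= 0
# Linearize with w = u - v, b = b_pos - b_neg, all variables >= 0.
# Variable vector z = [u (d), v (d), b_pos, b_neg, xi (n)]
c = np.hstack([np.ones(2 * d), [0.0, 0.0], C * np.ones(n)])

# Margin constraints rewritten as A_ub @ z <= b_ub
Yx = y[:, None] * X
A_ub = np.hstack([-Yx, Yx, -y[:, None], y[:, None], -np.eye(n)])
b_ub = -np.ones(n)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
w = res.x[:d] - res.x[d:2 * d]
b = res.x[2 * d] - res.x[2 * d + 1]
print("status:", res.status, "w:", w, "b:", b)
```

Note that A_ub has one row per training example and one column per (split) weight: the row-column sparsity that RCSC-LP names. An RCG method would start from a small subset of these rows and columns and generate more only as needed, so the subproblem never grows beyond the number of SVs.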