

Analysis of programming properties and the row-column generation method for 1-norm support vector machines.

Author information

Research Center of Machine Learning and Data Analysis, School of Computer Science and Technology, Soochow University, Suzhou 215006, Jiangsu, China.

Publication information

Neural Netw. 2013 Dec;48:32-43. doi: 10.1016/j.neunet.2013.07.005. Epub 2013 Jul 15.

Abstract

This paper deals with fast methods for training a 1-norm support vector machine (SVM). First, we define a specific class of linear programs with many sparse constraints, namely row-column sparse constraint linear programming (RCSC-LP). By nature, the 1-norm SVM is a type of RCSC-LP. To construct subproblems for RCSC-LP and solve them, a family of row-column generation (RCG) methods is introduced. RCG methods belong to the category of decomposition techniques and perform row and column generation in parallel. In particular, for the 1-norm SVM, the maximum size of the subproblems in RCG equals the number of support vectors (SVs). We also introduce a semi-deleting rule for RCG methods and prove their convergence when this rule is used. Experimental results on toy data and real-world datasets show that RCG is efficient for training the 1-norm SVM, especially when the number of SVs is small.
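To make the LP formulation concrete, the sketch below trains a 1-norm SVM by solving the full linear program directly with `scipy.optimize.linprog`, splitting `w = u - v` (with `u, v >= 0`) to linearize the 1-norm. This is only a minimal illustration of the underlying LP, not the authors' RCG decomposition method, and the function name `train_1norm_svm` and regularization parameter `C` are assumptions for the sketch.

```python
import numpy as np
from scipy.optimize import linprog

def train_1norm_svm(X, y, C=1.0):
    """Sketch: solve the 1-norm SVM as one full LP (no RCG decomposition).

    min  sum(u + v) + C * sum(xi)
    s.t. y_i * ((u - v) @ x_i + b) >= 1 - xi_i,   u, v, xi >= 0,
    where w = u - v linearizes the 1-norm penalty |w|_1.
    """
    n, d = X.shape
    # Variable layout: [u (d), v (d), b_pos, b_neg, xi (n)]; b = b_pos - b_neg
    c = np.concatenate([np.ones(2 * d), [0.0, 0.0], C * np.ones(n)])
    Yx = y[:, None] * X  # row i is y_i * x_i
    # Margin constraints rewritten as A_ub @ z <= b_ub:
    # -(y_i x_i) @ u + (y_i x_i) @ v - y_i b_pos + y_i b_neg - xi_i <= -1
    A_ub = np.hstack([-Yx, Yx, -y[:, None], y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * d + 2 + n), method="highs")
    z = res.x
    w = z[:d] - z[d:2 * d]
    b = z[2 * d] - z[2 * d + 1]
    return w, b

# Usage on a tiny separable toy problem
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = train_1norm_svm(X, y, C=10.0)
```

The RCG methods analyzed in the paper avoid forming this full `n x (2d + n + 2)` constraint matrix: they iteratively generate only the rows and columns needed, so the subproblem size is bounded by the number of SVs.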

