
Structural regularized support vector machine: a framework for structural large margin classifier.

Author information

Xue Hui, Chen Songcan, Yang Qiang

Affiliation

School of Computer Science and Engineering, Southeast University, Nanjing 210016, China.

Publication information

IEEE Trans Neural Netw. 2011 Apr;22(4):573-87. doi: 10.1109/TNN.2011.2108315. Epub 2011 Mar 7.

Abstract

Support vector machine (SVM), one of the most popular classifiers, aims to find a hyperplane that separates two classes of data with maximal margin. SVM classifiers focus on achieving separation between classes rather than on exploiting the structure of the training data within each class. However, such structural information, as a form of implicit prior knowledge, has recently been found to be vital for designing good classifiers in many real-world problems. Accordingly, exploiting as much prior structural information in the data as possible to improve the generalization ability of a classifier has yielded a class of effective structural large margin classifiers, such as the structured large margin machine (SLMM) and the Laplacian support vector machine (LapSVM). In this paper, we unify these classifiers into a common framework based on the concept of "structural granularity" and the formulation of their optimization problems. Within this framework, we analyze the quadratic programming (QP) and second-order cone programming (SOCP) formulations and derive a novel large margin classifier, which we call the structural regularized support vector machine (SRSVM). Unlike SLMM, which lies at the intersection of cluster granularity and SOCP, and LapSVM, which lies at the intersection of point granularity and QP, SRSVM is located at the intersection of cluster granularity and QP, and thus follows the same optimization formulation as LapSVM, avoiding the large computational complexity and non-sparse solutions of SLMM. In addition, it integrates the compactness within classes with the separability between classes simultaneously. Furthermore, it is possible to derive generalization bounds for these algorithms by using eigenvalue analysis of the kernel matrices. Experimental results demonstrate that SRSVM is often superior in classification and generalization performance to the state-of-the-art algorithms in the framework, at both the same and different structural granularities.
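The abstract does not give the explicit optimization problem, but the idea of folding within-class compactness into the SVM regularizer can be illustrated with a small sketch. The code below is a minimal, hypothetical linear-kernel illustration, assuming a regularizer of the form w^T(I + λΣ)w, where Σ is a sum of within-cluster covariance matrices obtained by clustering each class; the helper names, the use of k-means for cluster extraction, and the cluster weighting are illustrative assumptions, not the authors' exact SRSVM formulation. With a Cholesky factorization I + λΣ = LL^T, the problem reduces to a standard soft-margin SVM QP on the transformed inputs L^{-1}x.

```python
# Minimal sketch (not the authors' exact SRSVM): a linear SVM whose
# regularizer w^T (I + lam * Sigma) w folds in within-cluster covariance.
# Assumes NumPy and scikit-learn; k-means cluster extraction is an
# illustrative choice, not necessarily the clustering used in the paper.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC


def within_cluster_scatter(X, y, n_clusters_per_class=2, seed=0):
    """Sum of covariance matrices of the clusters found inside each class."""
    d = X.shape[1]
    sigma = np.zeros((d, d))
    for label in np.unique(y):
        Xc = X[y == label]
        k = min(n_clusters_per_class, len(Xc))
        labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(Xc)
        for j in range(k):
            Xj = Xc[labels == j]
            if len(Xj) > 1:
                # Weight each cluster's covariance by its fraction of the data.
                sigma += (len(Xj) / len(X)) * np.cov(Xj, rowvar=False)
    return sigma


def fit_structural_svm(X, y, lam=1.0, C=1.0):
    """Solve min_w 0.5 * w^T (I + lam*Sigma) w + C * sum(hinge losses) by a
    change of variables: with I + lam*Sigma = L L^T, this is an ordinary
    linear SVM on the whitened inputs L^{-1} x."""
    d = X.shape[1]
    M = np.eye(d) + lam * within_cluster_scatter(X, y)
    L = np.linalg.cholesky(M)                      # M = L @ L.T
    X_tilde = np.linalg.solve(L, X.T).T            # x_tilde = L^{-1} x
    clf = SVC(kernel="linear", C=C).fit(X_tilde, y)
    v = clf.coef_.ravel()
    w = np.linalg.solve(L.T, v)                    # w = L^{-T} v
    b = clf.intercept_[0]
    return w, b                                    # decision: sign(w @ x + b)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1.5, 1.0, (50, 2)), rng.normal(1.5, 1.0, (50, 2))])
    y = np.array([-1] * 50 + [1] * 50)
    w, b = fit_structural_svm(X, y, lam=0.5, C=1.0)
    print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Under this change of variables the problem remains a sparse QP of the same size as a standard SVM, which illustrates the computational point the abstract makes when contrasting the QP-based SRSVM with the SOCP-based SLMM.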

