

Fast sparse approximation for least squares support vector machine.

Author information

Jiao Licheng, Bo Liefeng, Wang Ling

Affiliation

Institute of Intelligent Information Processing, Xidian University, Xi'an 710071, China.

Publication information

IEEE Trans Neural Netw. 2007 May;18(3):685-97. doi: 10.1109/TNN.2006.889500.

Abstract

In this paper, we present two fast sparse approximation schemes for the least squares support vector machine (LS-SVM), named FSALS-SVM and PFSALS-SVM, to overcome LS-SVM's limitation that it is not applicable to large data sets and to improve test speed. FSALS-SVM iteratively builds the decision function by adding one basis function from a kernel-based dictionary at a time. The process is terminated using a flexible and stable epsilon-insensitive stopping criterion. A probabilistic speedup scheme is employed to further improve the speed of FSALS-SVM, and the resulting classifier is named PFSALS-SVM. Our algorithms have two compelling features: low complexity and sparse solutions. Experiments on benchmark data sets show that our algorithms obtain sparse classifiers at a rather low cost without sacrificing generalization performance.
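The abstract describes a greedy construction: at each step one basis function is drawn from a kernel dictionary (the columns of the training kernel matrix), the least-squares fit is updated, and the loop stops when the error reduction falls below a tolerance. The sketch below illustrates that general scheme in Python; it is an assumption-laden simplification, not the authors' exact FSALS-SVM (no probabilistic speedup, a plain mean-squared-error improvement test in place of their epsilon-insensitive criterion, and hypothetical parameter names `lam`, `eps`, `gamma`):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Pairwise RBF (Gaussian) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def greedy_sparse_lssvm(X, y, lam=1e-3, eps=1e-3, max_basis=50, gamma=0.5):
    """Greedy sparse kernel least-squares fit, in the spirit of FSALS-SVM:
    add one dictionary column (one basis function) at a time, stopping when
    the drop in squared error falls below eps. This is a sketch, not the
    paper's algorithm; the paper uses cheaper rank-one updates."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)      # dictionary: column j is k(., x_j)
    chosen, prev_loss, alpha = [], np.inf, None
    for _ in range(max_basis):
        best_j, best_loss, best_alpha = None, prev_loss, None
        for j in range(n):
            if j in chosen:
                continue
            Phi = K[:, chosen + [j]]             # candidate basis set
            # Regularized least-squares coefficients on the current basis.
            A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
            a = np.linalg.solve(A, Phi.T @ y)
            loss = np.mean((Phi @ a - y) ** 2)
            if loss < best_loss:
                best_j, best_loss, best_alpha = j, loss, a
        # Stop when no candidate improves the fit by more than eps.
        if best_j is None or prev_loss - best_loss < eps:
            break
        chosen.append(best_j)
        alpha, prev_loss = best_alpha, best_loss
    return chosen, alpha
```

The decision function then uses only the selected basis points: `np.sign(rbf_kernel(X_test, X[chosen]) @ alpha)`, which is what makes the resulting classifier sparse and fast at test time.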
