Suppr 超能文献



Feasibility and finite convergence analysis for accurate on-line ν-support vector machine.

Publication Info

IEEE Trans Neural Netw Learn Syst. 2013 Aug;24(8):1304-15. doi: 10.1109/TNNLS.2013.2250300.

DOI: 10.1109/TNNLS.2013.2250300
PMID: 24808569
Abstract

The ν-support vector machine (ν-SVM) for classification has the advantage of using a parameter ν to control the number of support vectors and margin errors. Recently, an accurate on-line ν-SVM algorithm (AONSVM) was proposed for training ν-SVM. AONSVM can be viewed as a special case of parametric quadratic programming techniques. Experimental analysis has demonstrated that AONSVM avoids infeasible updating paths as far as possible and successfully converges to the optimal solution. However, because of the differences between AONSVM and classical parametric quadratic programming techniques, these conclusions lacked theoretical justification. In this paper, we prove the feasibility and finite convergence of AONSVM under two assumptions. The main results of the feasibility analysis are: 1) the inverses of the two key matrices in AONSVM always exist; 2) the rules for updating the two key inverse matrices are reliable; 3) the variable ζ can efficiently control the adjustment of the sum of all the weights; and 4) a sample cannot migrate back and forth in successive adjustment steps among the set of margin support vectors, the set of error support vectors, and the set of the remaining vectors. Moreover, these analyses also directly yield proofs of the feasibility and finite convergence of accurate on-line C-SVM learning.
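The role of the parameter ν described in the abstract can be illustrated with a standard batch ν-SVM. The sketch below uses scikit-learn's `NuSVC`, not the paper's on-line AONSVM solver; it only demonstrates the ν-property (ν upper-bounds the fraction of margin errors and lower-bounds the fraction of support vectors):

```python
import numpy as np
from sklearn.svm import NuSVC

# Two well-separated Gaussian blobs as a toy binary classification task.
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(20, 2) + 2.0, rng.randn(20, 2) - 2.0])
y = np.array([1] * 20 + [-1] * 20)

# nu = 0.5: at most 50% of training points may be margin errors,
# and at least 50% of them become support vectors.
clf = NuSVC(nu=0.5, kernel="rbf", gamma="scale")
clf.fit(X, y)

frac_sv = clf.support_.size / len(X)
print(f"fraction of support vectors: {frac_sv:.2f}")
```

Raising ν trades a wider margin (more support vectors) against tolerating more margin errors, which is the trade-off the paper's on-line algorithm tracks exactly as samples arrive.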

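Result 2) of the feasibility analysis concerns maintaining inverse matrices across incremental adjustment steps. The paper's exact update rules are not reproduced here; as a generic sketch, on-line solvers in this family typically refresh a stored inverse after a rank-one change via the Sherman-Morrison identity instead of re-inverting from scratch:

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Given A^{-1}, return (A + u v^T)^{-1} via a rank-one correction.

    Valid whenever 1 + v^T A^{-1} u != 0; an O(n^2) update versus
    O(n^3) for a full re-inversion.
    """
    Au = A_inv @ u
    vA = v @ A_inv
    denom = 1.0 + v @ Au
    return A_inv - np.outer(Au, vA) / denom

# Sanity check against direct inversion on a well-conditioned matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)) + 5.0 * np.eye(5)
u = rng.standard_normal(5)
v = rng.standard_normal(5)
updated = sherman_morrison_update(np.linalg.inv(A), u, v)
direct = np.linalg.inv(A + np.outer(u, v))
print(np.allclose(updated, direct))
```

The paper's contribution on this point is proving that, for AONSVM's two key matrices, such update rules are always well-defined, i.e., the required inverses exist at every step.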

Similar Articles

1. Feasibility and finite convergence analysis for accurate on-line ν-support vector machine.
   IEEE Trans Neural Netw Learn Syst. 2013 Aug;24(8):1304-15. doi: 10.1109/TNNLS.2013.2250300.
2. Accurate on-line ν-support vector learning.
   Neural Netw. 2012 Mar;27:51-9. doi: 10.1016/j.neunet.2011.10.006. Epub 2011 Oct 20.
3. Incremental learning for ν-Support Vector Regression.
   Neural Netw. 2015 Jul;67:140-50. doi: 10.1016/j.neunet.2015.03.013. Epub 2015 Apr 6.
4. Regularization path for ν-support vector classification.
   IEEE Trans Neural Netw Learn Syst. 2012 May;23(5):800-11. doi: 10.1109/TNNLS.2012.2183644.
5. Training ν-support vector classifiers: theory and algorithms.
   Neural Comput. 2001 Sep;13(9):2119-47. doi: 10.1162/089976601750399335.
6. A Robust Regularization Path Algorithm for ν-Support Vector Classification.
   IEEE Trans Neural Netw Learn Syst. 2017 May;28(5):1241-1248. doi: 10.1109/TNNLS.2016.2527796. Epub 2016 Feb 24.
7. Incremental Support Vector Learning for Ordinal Regression.
   IEEE Trans Neural Netw Learn Syst. 2015 Jul;26(7):1403-16. doi: 10.1109/TNNLS.2014.2342533. Epub 2014 Aug 12.
8. Kernel Path for ν-Support Vector Classification.
   IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):490-501. doi: 10.1109/TNNLS.2021.3097248. Epub 2023 Jan 5.
9. Robust support vector machine-trained fuzzy system.
   Neural Netw. 2014 Feb;50:154-65. doi: 10.1016/j.neunet.2013.11.013. Epub 2013 Nov 21.
10. Pair-ν-SVR: A Novel and Efficient Pairing ν-Support Vector Regression Algorithm.
    IEEE Trans Neural Netw Learn Syst. 2017 Nov;28(11):2503-2515. doi: 10.1109/TNNLS.2016.2598182.

Citing Articles

1. A Novel Recurrent Neural Network-Based Ultra-Fast, Robust, and Scalable Solver for Inverting a "Time-Varying Matrix".
   Sensors (Basel). 2019 Sep 16;19(18):4002. doi: 10.3390/s19184002.
2. Two New PRP Conjugate Gradient Algorithms for Minimization Optimization Models.
   PLoS One. 2015 Oct 26;10(10):e0140071. doi: 10.1371/journal.pone.0140071. eCollection 2015.
3. A Modified BFGS Formula Using a Trust Region Model for Nonsmooth Convex Minimizations.
   PLoS One. 2015 Oct 26;10(10):e0140606. doi: 10.1371/journal.pone.0140606. eCollection 2015.
4. A Conjugate Gradient Algorithm with Function Value Information and N-Step Quadratic Convergence for Unconstrained Optimization.
   PLoS One. 2015 Sep 18;10(9):e0137166. doi: 10.1371/journal.pone.0137166. eCollection 2015.
5. A Limited-Memory BFGS Algorithm Based on a Trust-Region Quadratic Model for Large-Scale Nonlinear Equations.
   PLoS One. 2015 May 7;10(5):e0120993. doi: 10.1371/journal.pone.0120993. eCollection 2015.