
Lagrangian support vector regression via unconstrained convex minimization.

Author information

School of Computer and Systems Sciences, Jawaharlal Nehru University, New Delhi 110067, India.

Publication information

Neural Netw. 2014 Mar;51:67-79. doi: 10.1016/j.neunet.2013.12.003. Epub 2013 Dec 11.

Abstract

In this paper, a simple reformulation of the Lagrangian dual of the 2-norm support vector regression (SVR) is proposed as an unconstrained minimization problem. This formulation has the advantage that its objective function is strongly convex and has only m variables, where m is the number of input data points. The proposed unconstrained Lagrangian SVR (ULSVR) is solvable by computing the zeros of the gradient of its objective function. However, since the objective function contains the non-smooth 'plus' function, two approaches are followed to solve the proposed optimization problem: (i) introduce a smooth approximation and solve the slightly modified unconstrained minimization problem that results; (ii) solve the problem directly by applying the generalized derivative. Computational results obtained on a number of synthetic and real-world benchmark datasets show generalization performance similar to that of conventional SVR with much faster learning, and training times very close to those of least squares SVR; these results clearly indicate the superiority of ULSVR solved by the smooth-approximation and generalized-derivative approaches.
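To make the two solution strategies concrete, the sketch below illustrates them in Python. Approach (i) replaces the non-smooth plus function (x)_+ = max(x, 0) with the smooth surrogate p(x, α) = x + (1/α) log(1 + exp(−αx)), a standard smoothing in this literature, after which the strongly convex objective can be minimized by gradient or Newton steps; approach (ii) instead works with a generalized derivative of (x)_+ directly. The toy objective, the names smooth_plus and grad_smoothed, and all parameter values are hypothetical stand-ins for exposition, not the paper's exact ULSVR formulation.

```python
import numpy as np

def plus(x):
    """The non-smooth 'plus' function (x)_+ = max(x, 0)."""
    return np.maximum(x, 0.0)

def smooth_plus(x, alpha=5.0):
    """Smooth surrogate p(x, alpha) = x + log(1 + exp(-alpha*x)) / alpha;
    p(x, alpha) -> (x)_+ as alpha -> infinity (approach (i))."""
    # np.logaddexp(0, -alpha*x) computes log(1 + exp(-alpha*x)) without overflow.
    return x + np.logaddexp(0.0, -alpha * x) / alpha

def smooth_plus_deriv(x, alpha=5.0):
    """Derivative of smooth_plus: the logistic sigmoid(alpha * x)."""
    return 1.0 / (1.0 + np.exp(-alpha * x))

# Illustrative strongly convex objective containing the plus function
# (a hypothetical stand-in, NOT the paper's exact ULSVR dual objective):
#   f(u) = 0.5 * ||u||^2 + C * sum_i ((A @ u - b)_i)_+ ** 2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
C = 1.0

def grad_smoothed(u, alpha=5.0):
    """Gradient of f with (.)_+ replaced by smooth_plus (approach (i)).
    Substituting the non-smooth plus(r) with the Heaviside step as its
    generalized derivative, i.e. plus(r) * (r > 0), mimics approach (ii)."""
    r = A @ u - b
    return u + 2.0 * C * A.T @ (smooth_plus(r, alpha) * smooth_plus_deriv(r, alpha))

# Plain gradient descent on the smoothed problem; a Newton iteration applies
# equally, since the smoothed objective is twice differentiable and strongly
# convex, which is what makes zero-finding on the gradient well posed.
u = np.zeros(5)
for _ in range(2000):
    u -= 1e-2 * grad_smoothed(u)
print(f"gradient norm at the computed minimizer: {np.linalg.norm(grad_smoothed(u)):.2e}")
```

The np.logaddexp call keeps the surrogate finite for large |alpha * x|, where a naive exp would overflow; the gradient norm printed at the end should be near zero, reflecting the zero-of-the-gradient characterization of the minimizer.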

