Department of Mathematics, Shanghai Normal University, 200234, PR China.
Neural Netw. 2010 Apr;23(3):365-72. doi: 10.1016/j.neunet.2009.07.002. Epub 2009 Jul 10.
The learning speed of classical Support Vector Regression (SVR) is low, since it is constructed by minimizing a convex quadratic function subject to two groups of linear inequality constraints over all training samples. In this paper we propose Twin Support Vector Regression (TSVR), a novel regressor that determines a pair of ε-insensitive up- and down-bound functions by solving two related SVM-type problems, each of which is smaller than the single large problem in classical SVR. The TSVR formulation is in the spirit of the Twin Support Vector Machine (TSVM), which works with two nonparallel planes. Experimental results on several artificial and benchmark datasets indicate that the proposed TSVR is not only fast, but also shows good generalization performance.
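The abstract's construction can be illustrated concretely: TSVR solves two small constrained quadratic programs, one for a down-bound function f1(x) and one for an up-bound function f2(x), and averages them for the final prediction. The sketch below is a minimal, illustrative implementation on a synthetic linear dataset; it solves the two primal problems directly with a general-purpose solver (`scipy.optimize.minimize` with SLSQP) rather than the dual QP formulation used in the paper, and all parameter values (ε, C, the dataset) are assumptions chosen for illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic linear regression data (illustrative, not from the paper)
rng = np.random.default_rng(0)
A = np.linspace(0.0, 1.0, 20).reshape(-1, 1)          # training inputs
Y = 2.0 * A.ravel() + 0.5 + 0.05 * rng.standard_normal(20)

m, n = A.shape
eps = 0.1   # epsilon-insensitive width (same for both bounds here)
C = 1.0     # slack penalty

def solve_bound(sign):
    """Solve one TSVR primal problem.

    sign=+1: down-bound f1, fitted to Y - eps with Y - f1 >= eps - xi.
    sign=-1: up-bound  f2, fitted to Y + eps with f2 - Y >= eps - xi.
    Decision variables z = [w (n), b, xi (m)].
    """
    def obj(z):
        w, b, xi = z[:n], z[n], z[n + 1:]
        r = Y - sign * eps - (A @ w + b)               # shifted residual
        return 0.5 * r @ r + C * xi.sum()

    def ineq(z):
        w, b, xi = z[:n], z[n], z[n + 1:]
        return sign * (Y - (A @ w + b)) - eps + xi     # >= 0

    cons = [{"type": "ineq", "fun": ineq},
            {"type": "ineq", "fun": lambda z: z[n + 1:]}]   # xi >= 0
    z0 = np.zeros(n + 1 + m)
    res = minimize(obj, z0, constraints=cons, method="SLSQP")
    return res.x[:n], res.x[n]

w1, b1 = solve_bound(+1)   # down-bound function f1
w2, b2 = solve_bound(-1)   # up-bound function f2

def predict(X):
    # Final regressor: the mean of the two bound functions
    return 0.5 * ((X @ w1 + b1) + (X @ w2 + b2))
```

Each of the two problems involves only one group of inequality constraints over the m samples, which is the source of the speed advantage claimed over classical SVR, where one larger problem carries constraints for both bounds at once.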