Zhong Guangzheng, Xiao Yanshan, Liu Bo, Zhao Liang, Kong Xiangjun
IEEE Trans Neural Netw Learn Syst. 2024 Aug;35(8):11246-11260. doi: 10.1109/TNNLS.2023.3258464. Epub 2024 Aug 5.
Ordinal regression (OR) aims to solve multiclass classification problems with ordinal classes. Support vector OR (SVOR) is a typical OR algorithm and has been extensively used in OR problems. In this article, based on the characteristics of OR problems, we propose a novel pinball loss function and present an SVOR method with pinball loss (pin-SVOR). Pin-SVOR is fundamentally different from traditional SVOR with hinge loss. Traditional SVOR employs the hinge loss function, so the classifier is determined by only a few data points near the class boundaries, called support vectors, which may yield a classifier that is noise-sensitive and unstable under resampling. Distinctively, pin-SVOR employs the pinball loss function. It attaches an extra penalty to correctly classified data lying inside a class, so that all the training data are involved in determining the classifier. Data near the middle of each class incur a small penalty, and data near the class boundary incur a large penalty. Thus, the training data tend to lie near the middle of each class instead of on the class boundary, which leads to scatter minimization within each class and insensitivity to noise. The experimental results show that pin-SVOR achieves better classification performance than state-of-the-art OR methods.
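The contrast between the two losses can be sketched numerically. The snippet below is a minimal illustration, not the paper's pin-SVOR formulation (whose class-boundary thresholds are not given in the abstract): it shows, for a margin u = y·f(x), how the standard hinge loss vanishes for all well-classified points while a generic pinball loss (with an assumed quantile parameter `tau`) keeps charging them, so every training point contributes to the solution.

```python
import numpy as np

def hinge_loss(u):
    # Standard hinge loss: zero penalty once the margin u exceeds 1,
    # so only boundary points (support vectors) shape the classifier.
    return np.maximum(0.0, 1.0 - u)

def pinball_loss(u, tau=0.5):
    # Generic pinball loss: keeps the hinge-style penalty for margin
    # violations (u < 1), but also charges tau * (u - 1) to points
    # classified well inside the class (u > 1), pulling the data
    # toward the middle of the class. `tau` is an assumed parameter.
    return np.maximum(1.0 - u, -tau * (1.0 - u))

margins = np.array([-0.5, 0.5, 1.0, 2.0])
print("hinge:  ", hinge_loss(margins))    # zero for u >= 1
print("pinball:", pinball_loss(margins))  # nonzero even for u > 1
```

For the deep-inside-the-class point (u = 2.0), hinge gives 0 while pinball gives tau·(2 − 1) = 0.5, which is exactly the extra penalty on correctly classified interior data that the abstract describes.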