


Multiclass Classification and Feature Selection Based on Least Squares Regression with Large Margin.

Authors

Zhao Haifeng, Wang Siqi, Wang Zheng

Affiliations

College of Computer Science and Technology, Anhui University, Hefei 230601, China

Center for OPTical IMagery Analysis and Learning, Northwestern Polytechnical University, Xi'an 710072, China

Publication Information

Neural Comput. 2018 Oct;30(10):2781-2804. doi: 10.1162/neco_a_01116. Epub 2018 Jul 18.

DOI: 10.1162/neco_a_01116
PMID: 30021086
Abstract

Least squares regression (LSR) is a fundamental statistical analysis technique that has been widely applied to feature learning. However, limited by its simplicity, the local structure of the data is easily neglected, and many methods have considered using orthogonal constraints to preserve more local information. Another major drawback of LSR is that the loss between soft regression results and hard target values cannot precisely reflect classification ability; to address this, the idea of a large margin constraint has been put forward. We therefore draw on the concepts of large margin and orthogonal constraints to propose a novel algorithm, orthogonal least squares regression with large margin (OLSLM), for multiclass classification in this letter. The core task of this algorithm is to learn regression targets from the data and an orthogonal transformation matrix simultaneously, so that the proposed model not only ensures every data point can be correctly classified with a larger margin than conventional least squares regression, but also preserves more local data structure information in the subspace. Our efficient optimization method, which solves the large margin and orthogonal constraints iteratively, is proved to be convergent in both theory and practice. We also apply the large margin constraint when generating a sparse learning model for feature selection via joint [Formula: see text]-norm minimization on both the loss function and the regularization terms. Experimental results validate that our method performs better than state-of-the-art methods on various real-world data sets.

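The conventional LSR baseline that the abstract extends can be sketched as follows. This is a minimal illustration of plain ridge-regularized least squares regression with hard one-hot targets, not the paper's OLSLM (no learned targets, margin, or orthogonal constraint); all function and variable names here are illustrative, not from the paper.

```python
import numpy as np

def fit_lsr(X, y, n_classes, reg=1e-3):
    """Learn W minimizing ||Xb W - Y||_F^2 + reg ||W||_F^2,
    where Y holds hard one-hot target rows (the 'hard target
    values' the abstract refers to)."""
    n = X.shape[0]
    Xb = np.hstack([X, np.ones((n, 1))])   # append a bias column
    Y = np.eye(n_classes)[y]               # hard 0/1 regression targets
    # Closed-form ridge solution: W = (Xb^T Xb + reg*I)^{-1} Xb^T Y
    W = np.linalg.solve(Xb.T @ Xb + reg * np.eye(Xb.shape[1]), Xb.T @ Y)
    return W

def predict_lsr(X, W):
    """Classify by the argmax of the soft regression scores."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])
    return np.argmax(Xb @ W, axis=1)

# Tiny sanity check on linearly separable data.
X = np.array([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
y = np.array([0, 0, 1, 1])
W = fit_lsr(X, y, n_classes=2)
print(predict_lsr(X, W))  # -> [0 0 1 1]

# Feature-selection heuristic common in the joint-norm literature:
# rank features by the l2 norm of their row of W (bias row excluded).
scores = np.linalg.norm(W[:-1], axis=1)
```

The gap this baseline leaves is exactly what the abstract targets: the squared loss against fixed 0/1 targets penalizes correctly classified points whose scores overshoot, so OLSLM instead learns the targets jointly under a margin constraint.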

Similar Articles

1. Multiclass Classification and Feature Selection Based on Least Squares Regression with Large Margin.
Neural Comput. 2018 Oct;30(10):2781-2804. doi: 10.1162/neco_a_01116. Epub 2018 Jul 18.
2. Discriminative least squares regression for multiclass classification and feature selection.
IEEE Trans Neural Netw Learn Syst. 2012 Nov;23(11):1738-54. doi: 10.1109/TNNLS.2012.2212721.
3. Retargeted Least Squares Regression Algorithm.
IEEE Trans Neural Netw Learn Syst. 2015 Sep;26(9):2206-13. doi: 10.1109/TNNLS.2014.2371492. Epub 2014 Dec 2.
4. Subspace Sparse Discriminative Feature Selection.
IEEE Trans Cybern. 2022 Jun;52(6):4221-4233. doi: 10.1109/TCYB.2020.3025205. Epub 2022 Jun 16.
5. Constrained Low-Rank Learning Using Least Squares-Based Regularization.
IEEE Trans Cybern. 2017 Dec;47(12):4250-4262. doi: 10.1109/TCYB.2016.2623638. Epub 2016 Nov 10.
6. Pairwise Constraint-Guided Sparse Learning for Feature Selection.
IEEE Trans Cybern. 2016 Jan;46(1):298-310. doi: 10.1109/TCYB.2015.2401733. Epub 2015 Jul 6.
7. Generalized Embedding Regression: A Framework for Supervised Feature Extraction.
IEEE Trans Neural Netw Learn Syst. 2022 Jan;33(1):185-199. doi: 10.1109/TNNLS.2020.3027602. Epub 2022 Jan 5.
8. A generalized -norm regression based feature selection algorithm.
J Appl Stat. 2021 Sep 17;50(3):703-723. doi: 10.1080/02664763.2021.1975662. eCollection 2023.
9. Robust and Sparse Principal Component Analysis With Adaptive Loss Minimization for Feature Selection.
IEEE Trans Neural Netw Learn Syst. 2024 Mar;35(3):3601-3614. doi: 10.1109/TNNLS.2022.3194896. Epub 2024 Feb 29.
10. Robust Supervised and Semisupervised Least Squares Regression Using ℓ-Norm Minimization.
IEEE Trans Neural Netw Learn Syst. 2023 Nov;34(11):8389-8403. doi: 10.1109/TNNLS.2022.3150102. Epub 2023 Oct 27.

Cited By

1. iMIGS: An innovative AI based prediction system for selecting the best patient-specific glaucoma treatment.
MethodsX. 2023 May 18;10:102209. doi: 10.1016/j.mex.2023.102209. eCollection 2023.