Nie Feiping, Dong Xia, Tian Lai, Wang Rong, Li Xuelong
IEEE Trans Neural Netw Learn Syst. 2022 Apr;33(4):1702-1713. doi: 10.1109/TNNLS.2020.3043362. Epub 2022 Apr 4.
In this article, we propose a novel feature selection approach, named unsupervised feature selection with constrained l2,0-norm (row-sparsity constrained) and optimized graph (RSOGFS), which unifies feature selection and similarity matrix construction into a general framework instead of performing the two stages independently; thus, the similarity matrix preserving the local manifold structure of the data can be determined adaptively. Unlike sparse learning-based feature selection methods that can only solve relaxed or approximate problems by introducing a sparsity regularization term into the objective function, the proposed method directly tackles the original l2,0-norm constrained problem to achieve group feature selection. Two optimization strategies are provided to solve the original sparsity-constrained problem. The convergence and approximation guarantees of the new algorithms are rigorously proved, and the computational complexity and parameter determination are theoretically analyzed. Experimental results on real-world data sets show that the proposed method for solving a nonconvex problem is superior to state-of-the-art methods that solve the relaxed or approximate convex problems.
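The abstract combines two ingredients: an adaptively optimized similarity graph that preserves local manifold structure, and a hard row-sparsity (l2,0-norm) constraint that selects a group of features directly rather than through a relaxed regularizer. The Python/NumPy sketch below illustrates those two ideas in isolation under stated assumptions; it is not the RSOGFS algorithm itself, and the function names, the adaptive-neighbor weighting rule, and the top-k row-norm projection are illustrative choices, not the authors' optimization strategies.

```python
# Minimal sketch, assuming a simple adaptive-neighbor graph and a hard
# l2,0 (row-sparsity) projection; NOT the authors' RSOGFS updates.
import numpy as np

def adaptive_knn_similarity(X, n_neighbors=5):
    """Build a similarity matrix whose per-sample neighbor weights are set
    adaptively (closed-form simplex solution over the k+1 nearest distances),
    a common heuristic for preserving local manifold structure."""
    n = X.shape[0]
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # squared Euclidean distances
    np.fill_diagonal(d2, np.inf)                      # exclude self-similarity
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[:n_neighbors + 1]     # k+1 nearest neighbors
        d_i = d2[i, idx]
        # weights from min_s <s, d> + gamma ||s||^2 s.t. s >= 0, 1^T s = 1,
        # which yields exactly n_neighbors nonzero entries
        denom = n_neighbors * d_i[n_neighbors] - np.sum(d_i[:n_neighbors]) + 1e-12
        w = np.maximum((d_i[n_neighbors] - d_i[:n_neighbors]) / denom, 0.0)
        S[i, idx[:n_neighbors]] = w
    return (S + S.T) / 2.0                            # symmetrize

def l20_row_projection(W, k):
    """Project W onto {W : at most k nonzero rows} by keeping the k rows
    with the largest l2 norm; this is the hard row-sparsity (l2,0-norm)
    constraint, handled directly instead of via a regularization term."""
    row_norms = np.linalg.norm(W, axis=1)
    keep = np.argsort(row_norms)[::-1][:k]
    W_proj = np.zeros_like(W)
    W_proj[keep] = W[keep]
    return W_proj, np.sort(keep)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 10))      # 50 samples, 10 features
    S = adaptive_knn_similarity(X, n_neighbors=5)
    W = rng.standard_normal((10, 3))       # a candidate feature-projection matrix
    W_sparse, selected = l20_row_projection(W, k=4)
    print("selected feature indices:", selected)
```

In a full unified framework of the kind the abstract describes, the graph weights and the row-sparse projection matrix would be updated alternately against a shared objective; the sketch only shows each building block once.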