Guo Yu, Sun Yuan, Wang Zheng, Nie Feiping, Wang Fei
IEEE Trans Neural Netw Learn Syst. 2024 Oct;35(10):13354-13367. doi: 10.1109/TNNLS.2023.3267184. Epub 2024 Oct 7.
In this article, we propose a novel unsupervised feature selection model combined with clustering, named double-structured sparsity guided flexible embedding learning (DSFEL). DSFEL comprises a module for learning a block-diagonal structural sparse graph that represents the clustering structure and another module for learning a completely row-sparse projection matrix under an ℓ2,0-norm constraint to select distinctive features. Compared with the commonly used ℓ2,1-norm regularization term, the ℓ2,0-norm constraint avoids the drawbacks of limited sparsity and parameter tuning. Optimizing under the ℓ2,0-norm constraint is a nonconvex and nonsmooth problem and a formidable challenge; previous optimization algorithms have been able to provide only approximate solutions. To address this issue, this article proposes an efficient optimization strategy that yields a closed-form solution. Finally, comprehensive experiments on nine real-world datasets demonstrate that the proposed method outperforms existing state-of-the-art unsupervised feature selection methods.
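As a hedged illustration of the row-sparsity idea (not the authors' DSFEL algorithm): the ℓ2,0-norm of a projection matrix counts its nonzero rows, so constraining it to at most k forces the matrix to be row-sparse, and each surviving row marks one selected feature. A standard closed-form projection onto that constraint set keeps the k rows with the largest ℓ2-norms and zeroes the rest; the sketch below assumes this hard-thresholding form.

```python
import numpy as np

def project_l20(W, k):
    """Project W onto {W : at most k nonzero rows} by keeping the k rows
    with the largest l2-norms and zeroing all other rows."""
    row_norms = np.linalg.norm(W, axis=1)
    keep = np.argsort(row_norms)[-k:]      # indices of the k largest-norm rows
    W_proj = np.zeros_like(W)
    W_proj[keep] = W[keep]
    return W_proj, np.sort(keep)

rng = np.random.default_rng(0)
W = rng.standard_normal((6, 3))            # 6 features mapped to a 3-dim embedding
W_sparse, selected = project_l20(W, k=2)
print(selected)                            # indices of the 2 selected features
print(np.count_nonzero(np.linalg.norm(W_sparse, axis=1)))  # number of nonzero rows
```

Unlike ℓ2,1 regularization, which shrinks row norms and needs a tuned penalty weight to reach a desired sparsity level, this constraint form fixes the number of selected features k directly.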