Chen Jie, Yang Shengxiang, Wang Zhu, Mao Hua
IEEE Trans Neural Netw Learn Syst. 2023 Aug;34(8):4208-4222. doi: 10.1109/TNNLS.2021.3119278. Epub 2023 Aug 4.
Due to the capability of effectively learning intrinsic structures from high-dimensional data, techniques based on sparse representation have begun to display an impressive impact on several fields, such as image processing, computer vision, and pattern recognition. Learning sparse representations is often computationally expensive due to the iterative computations needed to solve convex optimization problems, in which the number of iterations is unknown before convergence. Moreover, most sparse representation algorithms focus only on determining the final sparse representation results and ignore the changes in the sparsity ratio (SR) during iterative computations. In this article, two algorithms are proposed to learn sparse representations based on locality-constrained linear representation learning with probabilistic simplex constraints. Specifically, the first algorithm, called approximated local linear representation (ALLR), obtains a closed-form solution from individual locality-constrained sparse representations. The second algorithm, called ALLR with symmetric constraints (ALLRSC), further obtains a symmetric sparse representation result with a limited number of computations; notably, the sparsity and convergence of the sparse representations can be guaranteed based on theoretical analysis. The steady decline in the SR during iterative computations is a critical factor in practical applications. Experimental results based on public datasets demonstrate that the proposed algorithms perform better than several state-of-the-art algorithms for learning with high-dimensional data.
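To make the kind of subproblem behind locality-constrained linear representation learning with a probabilistic simplex constraint concrete, the sketch below represents each sample as a convex combination of its nearest neighbors, with coefficients restricted to the probability simplex (nonnegative, summing to one). This is a generic illustration solved by projected gradient descent, not the authors' ALLR/ALLRSC updates or their closed-form solution; the function names, the neighborhood size `k`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto the probability simplex
    (sort-based algorithm of Duchi et al., 2008)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def local_simplex_coding(X, k=5, n_iter=200):
    """For each column x_i of X (features x samples), solve
        min_w ||x_i - N_i w||^2   s.t.  w >= 0, sum(w) = 1,
    where N_i holds the k nearest neighbors of x_i (excluding itself),
    using projected gradient descent onto the simplex.
    Returns an n x n coefficient matrix W whose i-th column encodes x_i."""
    d, n = X.shape
    W = np.zeros((n, n))
    # pairwise Euclidean distances used to select local neighborhoods
    D = np.linalg.norm(X[:, :, None] - X[:, None, :], axis=0)
    for i in range(n):
        order = np.argsort(D[:, i])
        nbrs = order[order != i][:k]          # indices of the k nearest neighbors
        N = X[:, nbrs]                        # d x m local dictionary (m <= k)
        m = len(nbrs)
        w = np.full(m, 1.0 / m)               # start at the simplex center
        step = 1.0 / (np.linalg.norm(N, 2) ** 2 + 1e-12)  # 1/Lipschitz constant
        for _ in range(n_iter):
            grad = N.T @ (N @ w - X[:, i])    # gradient of 0.5*||x_i - N w||^2
            w = project_to_simplex(w - step * grad)
        W[nbrs, i] = w
    return W
```

Because the simplex projection zeroes out many coefficients, the resulting columns are naturally sparse, which is the kind of behavior the abstract tracks through the sparsity ratio during iterative computations.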