Wu Yan, Jin Yunzhi
Yunnan Key Laboratory of Statistical Modeling and Data Analysis, Yunnan University, Kunming, China.
Front Big Data. 2024 Jul 2;7:1382144. doi: 10.3389/fdata.2024.1382144. eCollection 2024.
Low-rank tensor completion (LRTC), which aims to recover the missing entries of a partially observed tensor by exploiting its low-rank structure, has been widely applied to real-world problems. The core tensor nuclear norm minimization (CTNM) method based on Tucker decomposition is one of the most common LRTC methods. However, CTNM methods based on Tucker decomposition often incur a high computational cost because the general factor-matrix solving technique requires multiple singular value decompositions (SVDs) in each iteration. To address this problem, this article improves the method and proposes an efficient CTNM method based on thin QR decomposition (CTNM-QR) with lower computational complexity. The proposed method extends CTNM by introducing tensor versions of the auxiliary variables instead of matrices, and solves the factor matrices with thin QR decomposition rather than the SVD, which reduces the computational complexity and improves the tensor completion accuracy. In addition, the convergence and complexity of the CTNM-QR method are analyzed. Extensive experiments on synthetic data, real color images, and brain MRI data at various missing rates demonstrate that the proposed method not only achieves superior completion accuracy and visual quality, but also runs more efficiently than most state-of-the-art LRTC methods.
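To make the SVD-versus-thin-QR contrast concrete, below is a minimal Python sketch of the orthogonalization primitive at the heart of the factor-matrix update. It is an illustration under stated assumptions, not the paper's actual algorithm: the matrix M stands in for the m x r product whose orthonormal range yields a mode-n Tucker factor (in practice, an unfolding multiplied by a core-dependent term), and the sizes m, n, r are arbitrary. Both routes return a factor with orthonormal columns spanning the same subspace, but thin QR is typically cheaper than the SVD at the same size.

```python
import numpy as np

rng = np.random.default_rng(0)

m, n, r = 500, 2000, 20  # unfolding size and Tucker rank (illustrative values)

# Placeholder for the m x r matrix whose orthonormal range gives the mode-n
# factor; the paper's exact construction is not reproduced here.
M = rng.standard_normal((m, n)) @ rng.standard_normal((n, r))

# SVD route (general factor update, as in CTNM): take the left singular vectors.
U, _, _ = np.linalg.svd(M, full_matrices=False)
A_svd = U  # m x r factor with orthonormal columns

# Thin QR route (as in CTNM-QR): orthonormalize directly.
# NumPy's default mode='reduced' is the thin QR decomposition.
A_qr, _ = np.linalg.qr(M)

# Both factors are orthonormal and span the same subspace (same projector).
print(np.allclose(A_qr.T @ A_qr, np.eye(r)))                  # True
print(np.linalg.norm(A_svd @ A_svd.T - A_qr @ A_qr.T))        # ~0
```

The design point the sketch conveys: when only an orthonormal basis for the range is needed, and not the singular values themselves, thin QR delivers it with less work per iteration, which is where CTNM-QR saves over SVD-based CTNM.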