
Double Auto-Weighted Tensor Robust Principal Component Analysis.

Author Information

Wang Yulong, Kou Kit Ian, Chen Hong, Tang Yuan Yan, Li Luoqing

Publication Information

IEEE Trans Image Process. 2023;32:5114-5125. doi: 10.1109/TIP.2023.3310331. Epub 2023 Sep 12.

Abstract

Tensor Robust Principal Component Analysis (TRPCA), which aims to recover the low-rank and sparse components from their sum, has drawn intensive interest in recent years. Most existing TRPCA methods adopt the tensor nuclear norm (TNN) and the tensor l1 norm as the regularization terms for the low-rank and sparse components, respectively. However, TNN treats each singular value of the low-rank tensor L equally, and the tensor l1 norm shrinks each entry of the sparse tensor S with the same strength. It has been shown that larger singular values generally correspond to prominent information of the data and should be less penalized. The same goes for large entries in S in terms of absolute values. In this paper, we propose a Double Auto-weighted TRPCA (DATRPCA) method. Instead of using predefined and manually set weights merely for the low-rank tensor as previous works do, DATRPCA automatically and adaptively assigns smaller weights, and thus applies lighter penalization, to significant singular values of the low-rank tensor and large entries of the sparse tensor simultaneously. We have further developed an efficient algorithm to implement DATRPCA based on the Alternating Direction Method of Multipliers (ADMM) framework. In addition, we have established the convergence analysis of the proposed algorithm. The results on both synthetic and real-world data demonstrate the effectiveness of DATRPCA for low-rank tensor recovery, color image recovery and background modelling.
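The core idea the abstract describes — penalizing large singular values and large sparse entries more lightly via adaptive weights — can be sketched in the matrix case. This is not the authors' exact algorithm (their method operates on tensors with the TNN, and the weight rule below, w = 1/(value + eps), is an assumed illustrative choice), only a minimal sketch of the two weighted proximal steps an ADMM solver would alternate:

```python
import numpy as np

def weighted_svt(X, lam, eps=1e-6):
    """Auto-weighted singular value thresholding (matrix-case sketch).

    Larger singular values receive smaller weights w_i = 1/(sigma_i + eps),
    so prominent components are shrunk less than with the plain nuclear norm.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    w = 1.0 / (s + eps)                       # large sigma -> small weight
    s_shrunk = np.maximum(s - lam * w, 0.0)   # weighted soft shrinkage
    return U @ np.diag(s_shrunk) @ Vt

def weighted_soft_threshold(S, lam, eps=1e-6):
    """Auto-weighted entrywise shrinkage for the sparse component:
    large-magnitude entries are shrunk less than small ones."""
    w = 1.0 / (np.abs(S) + eps)
    return np.sign(S) * np.maximum(np.abs(S) - lam * w, 0.0)
```

In an ADMM loop for L + S = X, these two proximal updates would alternate with a Lagrange-multiplier update; the adaptive weights make both steps nonconvex but data-driven, replacing the uniform shrinkage of TNN and the l1 norm.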


Similar Articles

1. Double Auto-Weighted Tensor Robust Principal Component Analysis.
   IEEE Trans Image Process. 2023;32:5114-5125. doi: 10.1109/TIP.2023.3310331. Epub 2023 Sep 12.
2. Robust Low-Rank Tensor Recovery via Nonconvex Singular Value Minimization.
   IEEE Trans Image Process. 2020 Sep 18;PP. doi: 10.1109/TIP.2020.3023798.
3. Tensor decomposition based on the potential low-rank and -shrinkage generalized threshold algorithm for analyzing cancer multiomics data.
   J Bioinform Comput Biol. 2022 Apr;20(2):2250002. doi: 10.1142/S0219720022500020. Epub 2022 Feb 21.
4. Enhanced Tensor RPCA and its Application.
   IEEE Trans Pattern Anal Mach Intell. 2021 Jun;43(6):2133-2140. doi: 10.1109/TPAMI.2020.3017672. Epub 2021 May 11.
5. Low-Tubal-Rank Plus Sparse Tensor Recovery With Prior Subspace Information.
   IEEE Trans Pattern Anal Mach Intell. 2021 Oct;43(10):3492-3507. doi: 10.1109/TPAMI.2020.2986773. Epub 2021 Sep 2.
6. Improved robust tensor principal component analysis for accelerating dynamic MR imaging reconstruction.
   Med Biol Eng Comput. 2020 Jul;58(7):1483-1498. doi: 10.1007/s11517-020-02161-5. Epub 2020 May 5.
7. Tensor Robust Principal Component Analysis with a New Tensor Nuclear Norm.
   IEEE Trans Pattern Anal Mach Intell. 2020 Apr;42(4):925-938. doi: 10.1109/TPAMI.2019.2891760. Epub 2019 Jan 9.
8. Tensor Robust Kernel PCA for Multidimensional Data.
   IEEE Trans Neural Netw Learn Syst. 2025 Feb;36(2):2662-2674. doi: 10.1109/TNNLS.2024.3356228. Epub 2025 Feb 6.
9. Logarithmic Schatten-p Norm Minimization for Tensorial Multi-View Subspace Clustering.
   IEEE Trans Pattern Anal Mach Intell. 2023 Mar;45(3):3396-3410. doi: 10.1109/TPAMI.2022.3179556. Epub 2023 Feb 3.
10. Color Image Restoration Using Sub-Image Based Low-Rank Tensor Completion.
   Sensors (Basel). 2023 Feb 3;23(3):1706. doi: 10.3390/s23031706.
