Mai Truong Thanh Nhat, Lam Edmund Y, Lee Chul
IEEE Trans Pattern Anal Mach Intell. 2024 Dec;46(12):9818-9833. doi: 10.1109/TPAMI.2024.3429498. Epub 2024 Nov 6.
Low-rank tensor completion (LRTC) aims to recover missing data of high-dimensional structures from a limited set of observed entries. Despite recent significant successes, the original structures of data tensors are still not effectively preserved in LRTC algorithms, yielding less accurate restoration results. Moreover, LRTC algorithms often incur high computational costs, which hinder their applicability. In this work, we propose an attention-guided low-rank tensor completion (AGTC) algorithm, which can faithfully restore the original structures of data tensors using deep unfolding attention-guided tensor factorization. First, we formulate the LRTC task as a robust factorization problem based on low-rank and sparse error assumptions. Low-rank tensor recovery is guided by an attention mechanism to better preserve the structures of the original data. We also develop implicit regularizers to compensate for modeling inaccuracies. Then, we solve the optimization problem by employing an iterative technique. Finally, we design a multistage deep network by unfolding the iterative algorithm, where each stage corresponds to an iteration of the algorithm; at each stage, the optimization variables and regularizers are updated by closed-form solutions and learned deep networks, respectively. Experimental results for high dynamic range imaging and hyperspectral image restoration show that the proposed algorithm outperforms state-of-the-art algorithms.
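As a rough illustration of the iterative scheme the abstract describes, the sketch below implements a classical low-rank + sparse completion loop of the kind that deep-unfolding methods build on. This is a simplified matrix-case analogue under stated assumptions, not the authors' AGTC network: the rank truncation and soft-thresholding steps here are hand-coded, whereas each unfolded network stage would replace them with closed-form updates plus learned regularizer modules, and the attention guidance is omitted entirely.

```python
# Minimal sketch (NOT the authors' AGTC algorithm): alternating updates for a
# low-rank component L and a sparse error component S from partially observed
# data. Each loop iteration plays the role of one "stage" of an unfolded
# network, where learned deep modules would replace the fixed shrinkage steps.
import numpy as np

def lowrank_sparse_completion(Y, mask, rank=1, lam=0.1, n_iters=50):
    """Recover L (low-rank) and S (sparse error) from observed entries of Y.

    Y    : data matrix (values at unobserved entries are ignored)
    mask : boolean array, True where Y is observed
    """
    L = np.zeros_like(Y)
    S = np.zeros_like(Y)
    for _ in range(n_iters):
        # Low-rank update: rank-truncated SVD of the corrected data, with
        # unobserved entries imputed from the current low-rank estimate.
        X = np.where(mask, Y - S, L)
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        L = (U[:, :rank] * s[:rank]) @ Vt[:rank]
        # Sparse update: soft-thresholding (the l1 proximal step) of the
        # data-fit residual on observed entries only.
        R = np.where(mask, Y - L, 0.0)
        S = np.sign(R) * np.maximum(np.abs(R) - lam, 0.0)
    return L, S

# Usage: complete a rank-1 matrix with ~30% missing entries and one outlier.
rng = np.random.default_rng(0)
u, v = rng.standard_normal((20, 1)), rng.standard_normal((1, 20))
truth = u @ v
mask = rng.random(truth.shape) > 0.3
mask[0, 0] = True
Y = truth.copy()
Y[0, 0] += 5.0  # sparse corruption at an observed entry
L, S = lowrank_sparse_completion(Y, mask, rank=1)
rel_err = np.linalg.norm((L - truth)[mask]) / np.linalg.norm(truth[mask])
```

The alternating structure mirrors the abstract's formulation: a closed-form low-rank update guided by the current error estimate, followed by a sparse-error update, repeated for a fixed number of stages.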