Tensor Factorization for Low-Rank Tensor Completion.

Publication

IEEE Trans Image Process. 2018 Mar;27(3):1152-1163. doi: 10.1109/TIP.2017.2762595. Epub 2017 Oct 12.

Abstract

Recently, a tensor nuclear norm (TNN) based method was proposed for the tensor completion problem and achieved state-of-the-art performance on image and video inpainting tasks. However, it requires computing the tensor singular value decomposition (t-SVD), which is computationally expensive and therefore scales poorly to naturally large tensor data. Motivated by TNN, we propose a novel low-rank tensor factorization method that efficiently solves the 3-way tensor completion problem. Our method preserves the low-rank structure of a tensor by factorizing it into the product of two tensors of smaller sizes. During optimization, our method only needs to update the two smaller tensors, which is much cheaper than computing the t-SVD. Furthermore, we prove that the proposed alternating minimization algorithm converges to a Karush-Kuhn-Tucker (KKT) point. Experimental results on synthetic data recovery and on image and video inpainting tasks clearly demonstrate the superior performance and efficiency of our method over state-of-the-art approaches, including TNN and matricization-based methods.
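The factorization idea in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: it assumes the product of the two smaller tensors is the t-product (facewise matrix products in the Fourier domain along the third mode), and it alternates least-squares updates of the two factors while re-imposing the observed entries each pass. The function name `tctf`, the fixed iteration count, and the update order are all assumptions for illustration.

```python
import numpy as np

def tctf(M, mask, r, n_iter=100):
    """Hypothetical sketch of tensor completion via low-rank tensor
    factorization: approximate M (n1 x n2 x n3) by the t-product of
    A (n1 x r x n3) and B (r x n2 x n3), r = estimated tubal rank."""
    n1, n2, n3 = M.shape
    rng = np.random.default_rng(0)
    X = np.where(mask, M, 0.0)          # zero-fill the missing entries
    A = rng.standard_normal((n1, r, n3))
    B = rng.standard_normal((r, n2, n3))
    for _ in range(n_iter):
        # FFT along the third mode: the t-product decouples into
        # independent matrix products on each frontal slice.
        Xf = np.fft.fft(X, axis=2)
        Af = np.fft.fft(A, axis=2)
        Bf = np.fft.fft(B, axis=2)
        for k in range(n3):
            # alternating least-squares updates of the two small factors
            Af[:, :, k] = Xf[:, :, k] @ np.linalg.pinv(Bf[:, :, k])
            Bf[:, :, k] = np.linalg.pinv(Af[:, :, k]) @ Xf[:, :, k]
            Xf[:, :, k] = Af[:, :, k] @ Bf[:, :, k]
        A = np.fft.ifft(Af, axis=2).real
        B = np.fft.ifft(Bf, axis=2).real
        X = np.fft.ifft(Xf, axis=2).real
        X[mask] = M[mask]               # keep observed entries fixed
    return X

# tiny demo: recover a tubal-rank-2 tensor from ~60% of its entries
n1, n2, n3, r = 20, 20, 5, 2
rng = np.random.default_rng(1)
A0 = rng.standard_normal((n1, r, n3))
B0 = rng.standard_normal((r, n2, n3))
# ground truth via the t-product (facewise products in Fourier domain)
T = np.fft.ifft(np.einsum('irk,rjk->ijk',
                          np.fft.fft(A0, axis=2),
                          np.fft.fft(B0, axis=2)), axis=2).real
mask = rng.random(T.shape) < 0.6
X = tctf(T, mask, r)
err = np.linalg.norm(X - T) / np.linalg.norm(T)
```

Note that each iteration touches only the two small factors (size `n1*r*n3` and `r*n2*n3`), which is the efficiency argument the abstract makes against repeatedly computing a full t-SVD.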

