Hou Jingyao, Zhang Feng, Qiu Haiquan, Wang Jianjun, Wang Yao, Meng Deyu
IEEE Trans Pattern Anal Mach Intell. 2022 Aug;44(8):4355-4373. doi: 10.1109/TPAMI.2021.3063527. Epub 2022 Jul 1.
Low-rank tensor recovery (LRTR) is a natural extension of low-rank matrix recovery (LRMR) to high-dimensional arrays; it aims to reconstruct an underlying tensor X from incomplete linear measurements A(X). However, LRTR ignores the error introduced by quantization, which limits its applicability when the quantization is coarse. In this work, we account for the impact of extreme quantization and suppose the quantizer degrades into a comparator that acquires only the signs of A(X). We still aim to recover X from these binary measurements. Under the tensor Singular Value Decomposition (t-SVD) framework, we propose two recovery methods: the first is a tensor hard singular tube thresholding method; the second is a constrained tensor nuclear norm minimization method. Both methods can recover a real n×n×n tensor X with tubal rank r from m random Gaussian binary measurements, with errors decaying at a polynomial rate in the oversampling factor λ := m/((n+n)nr). To improve the convergence rate, we develop a new quantization scheme under which the error decays exponentially in λ. Numerical experiments verify our results, and applications to real-world data demonstrate the promising performance of the proposed methods.
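The setting above can be sketched in a few lines of NumPy: truncate a tensor to tubal rank r via the t-SVD (FFT along the third mode, per-slice matrix SVD, inverse FFT), take one-bit measurements y_i = sign(⟨A_i, X⟩) with Gaussian sensing tensors, and form a simple backprojection-plus-thresholding estimate. This is only an illustrative sketch of the measurement model and of a hard-thresholding-style estimator; `tsvd_truncate` and the one-step backprojection are assumptions of this example, not the paper's exact algorithms, which come with the stated error guarantees.

```python
import numpy as np

def tsvd_truncate(X, r):
    """Project a 3-way tensor onto tubal rank r via the t-SVD:
    FFT along the third mode, truncate each frontal slice's SVD
    to its top-r singular values, then inverse FFT."""
    Xf = np.fft.fft(X, axis=2)
    out = np.zeros_like(Xf)  # complex working array
    for k in range(X.shape[2]):
        U, s, Vh = np.linalg.svd(Xf[:, :, k], full_matrices=False)
        out[:, :, k] = (U[:, :r] * s[:r]) @ Vh[:r, :]
    return np.real(np.fft.ifft(out, axis=2))

rng = np.random.default_rng(0)
n, r, m = 8, 2, 2000

# Ground truth: a unit-norm n x n x n tensor of tubal rank r.
X = tsvd_truncate(rng.standard_normal((n, n, n)), r)
X /= np.linalg.norm(X)

# One-bit measurements: only the signs of <A_i, X> survive quantization.
A = rng.standard_normal((m, n, n, n))          # Gaussian sensing tensors
y = np.sign(np.tensordot(A, X, axes=3))        # binary measurements in {-1, +1}

# Backprojection followed by tubal-rank truncation (hypothetical estimator):
# E[y_i * A_i] is proportional to X, so averaging then thresholding
# recovers the direction of X; the scale is lost under one-bit quantization.
B = np.tensordot(y, A, axes=([0], [0])) / m
Xhat = tsvd_truncate(B, r)
Xhat /= np.linalg.norm(Xhat)

cos = float(np.sum(X * Xhat))  # alignment between truth and estimate
```

Note that one-bit measurements destroy the norm of X, so only the normalized tensor X/||X||_F is identifiable; the oversampling factor λ governs how fast the angular error between X and the estimate shrinks as m grows.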