Chen Yongyong, Wang Shuqin, Peng Chong, Hua Zhongyun, Zhou Yicong
IEEE Trans Image Process. 2021;30:4022-4035. doi: 10.1109/TIP.2021.3068646. Epub 2021 Apr 5.
Low-rank tensor representation (LRTR) has become an emerging research direction for boosting multi-view clustering performance, because it exploits not only the pairwise relations between data points but also the relations among multiple views. However, one significant challenge remains: LRTR uses the tensor nuclear norm as a convex surrogate, which yields a biased estimate of the tensor rank function. To address this limitation, we propose the generalized nonconvex low-rank tensor approximation (GNLTA) for multi-view subspace clustering. Instead of relying on pairwise correlation, GNLTA adopts a low-rank tensor approximation to capture the high-order correlation among multiple views, and introduces a generalized nonconvex low-rank tensor norm that accounts for the different physical meanings of individual singular values. We develop a unified solver for the GNLTA model and prove that, under mild conditions, any accumulation point is a stationary point of GNLTA. Extensive experiments on seven commonly used benchmark databases demonstrate that the proposed GNLTA achieves better clustering performance than state-of-the-art methods.
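The core idea of penalizing singular values nonconvexly (rather than uniformly, as the tensor nuclear norm does) can be illustrated with a minimal sketch. The code below is not the authors' solver; it is a hypothetical example assuming a log-surrogate penalty whose proximal step reduces to weighted singular-value soft-thresholding, applied slice-wise in the Fourier domain in the t-SVD style common to tensor-nuclear-norm methods. Function names and the choice of surrogate are illustrative assumptions.

```python
import numpy as np

def nonconvex_sv_shrink(X, tau, eps=1e-3):
    """Weighted singular-value soft-thresholding: an (assumed) proximal
    step for a log-surrogate of the rank function. Larger singular values
    receive smaller weights, so they are penalized less -- the "physical
    meaning" motivation for nonconvex penalties."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    w = 1.0 / (s + eps)                    # small singular values -> large weights
    s_shrunk = np.maximum(s - tau * w, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

def tensor_nonconvex_shrink(T, tau):
    """t-SVD-style slice-wise shrinkage: FFT along the third (view) mode,
    shrink each frontal slice, then inverse FFT back."""
    Tf = np.fft.fft(T, axis=2)
    out = np.empty_like(Tf)
    for k in range(T.shape[2]):
        out[:, :, k] = nonconvex_sv_shrink(Tf[:, :, k], tau)
    return np.real(np.fft.ifft(out, axis=2))

# Toy 3-view tensor: 8 x 8 representation matrices stacked along mode 3.
rng = np.random.default_rng(0)
T = rng.standard_normal((8, 8, 3))
T_low = tensor_nonconvex_shrink(T, tau=0.5)
```

In a full multi-view subspace clustering pipeline, such a shrinkage step would sit inside an alternating (e.g. ADMM-style) loop that also updates per-view self-representation matrices; this sketch shows only the nonconvex low-rank proximal update.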