Multilinear Multitask Learning by Rank-Product Regularization.

Authors

Zhao Qian, Rui Xiangyu, Han Zhi, Meng Deyu

Publication

IEEE Trans Neural Netw Learn Syst. 2020 Apr;31(4):1336-1350. doi: 10.1109/TNNLS.2019.2919774. Epub 2019 Jun 24.

Abstract

Multilinear multitask learning (MLMTL) considers an MTL problem in which tasks are arranged by multiple indices. By exploiting the higher order correlations among the tasks, MLMTL is expected to improve on traditional MTL, which only considers the first-order correlation across all tasks, e.g., the low-rank structure of the coefficient matrix. The key to MLMTL is designing a rational regularization term to represent the latent correlation structure underlying the coefficient tensor instead of a matrix. In this paper, we propose a new MLMTL model by employing a rank-product regularization term in the objective, which on one hand can automatically rectify the weights along all its tensor modes and on the other hand has an explicit physical meaning. By using this regularization, the intrinsic high-order correlations among tasks can be more precisely described, and thus the overall performance on all tasks can be improved. To solve the resulting optimization model, we design an efficient algorithm by applying the alternating direction method of multipliers (ADMM). We also analyze the convergence and show that the proposed algorithm, under certain restrictions, is asymptotically regular. Experiments on both synthetic and real data sets substantiate the superiority of the proposed method over existing MLMTL methods in terms of accuracy and efficiency.
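To make the "rank-product" idea concrete: for a coefficient tensor, one can form the mode-n unfolding along each tensor mode and take the product of the ranks of these unfoldings. The following is a minimal illustrative sketch (not the authors' optimization algorithm); the helper names `unfold` and `rank_product` are our own, and the numerical-rank tolerance is an assumption for illustration.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` to the front, flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def rank_product(tensor, tol=1e-8):
    """Product of the numerical ranks of all mode unfoldings."""
    prod = 1
    for mode in range(tensor.ndim):
        s = np.linalg.svd(unfold(tensor, mode), compute_uv=False)
        prod *= int(np.sum(s > tol * s[0]))  # singular values above tolerance
    return prod

# A rank-1 tensor: every mode unfolding has rank 1, so the product is 1.
rng = np.random.default_rng(0)
a, b, c = rng.standard_normal(3), rng.standard_normal(4), rng.standard_normal(5)
W = np.einsum('i,j,k->ijk', a, b, c)
print(rank_product(W))  # 1
```

Because the exact rank is nonconvex and discrete, methods in this area typically minimize a continuous surrogate (e.g., built from nuclear norms of the unfoldings) rather than this product directly; the sketch only illustrates the quantity being regularized.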
