Zhou Jiayu, Chen Jianhui, Ye Jieping
Computer Science and Engineering, Arizona State University, Tempe, AZ 85287.
Adv Neural Inf Process Syst. 2011;2011:702-710.
Multi-task learning (MTL) learns multiple related tasks simultaneously to improve generalization performance. Alternating structure optimization (ASO) is a popular MTL method that learns a shared low-dimensional predictive structure on hypothesis spaces from multiple related tasks. It has been applied successfully in many real-world applications. As an alternative MTL approach, clustered multi-task learning (CMTL) assumes that multiple tasks follow a clustered structure, i.e., the tasks are partitioned into a set of groups, where tasks within the same group are similar to each other; this clustered structure is unknown a priori. The objectives in ASO and CMTL differ in how multiple tasks are related. Interestingly, we show in this paper an equivalence relationship between ASO and CMTL, providing significant new insights into ASO and CMTL as well as their inherent relationship. The CMTL formulation is non-convex, and we adopt a convex relaxation of it. We further establish the equivalence relationship between the proposed convex relaxation of CMTL and an existing convex relaxation of ASO, and show that the proposed convex CMTL formulation is significantly more efficient, especially for high-dimensional data. In addition, we present three algorithms for solving the convex CMTL formulation. We report experimental results on benchmark datasets to demonstrate the efficiency of the proposed algorithms.
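For concreteness, the following is a minimal LaTeX sketch of the kind of formulations the abstract describes, not the paper's exact notation: the symbols W, F, M, the loss L, and the regularization parameters alpha, beta, rho, and eta below are illustrative assumptions.

% Sketch of a non-convex CMTL-style objective. W = [w_1, ..., w_m]
% stacks the m task weight vectors; F is an m x k orthonormal
% cluster-indicator matrix encoding the unknown partition of tasks
% into k groups. The middle term is a k-means-type penalty that
% pulls together the weight vectors of tasks in the same cluster.
\min_{W,\; F:\, F^\top F = I_k} \;
  \sum_{i=1}^{m} L(w_i)
  \;+\; \alpha \Big( \operatorname{tr}(W^\top W)
        - \operatorname{tr}(F^\top W^\top W\, F) \Big)
  \;+\; \beta \operatorname{tr}(W^\top W)

% A convex relaxation replaces the combinatorial indicator F F^\top
% with a matrix M ranging over a convex spectral set, e.g.
\min_{W,\; M} \;
  \sum_{i=1}^{m} L(w_i)
  \;+\; \rho\, \eta\,(1+\eta)\,
        \operatorname{tr}\!\big( W (\eta I + M)^{-1} W^\top \big)
\quad \text{s.t.} \quad
  \operatorname{tr}(M) = k, \;\; 0 \preceq M \preceq I .

The relaxed problem is jointly convex in W and M; minimizing over the spectral constraint set recovers a soft clustering of the tasks, which is the structural assumption the abstract attributes to CMTL.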