Chen Jianhui, Liu Ji, Ye Jieping
Arizona State University.
ACM Trans Knowl Discov Data. 2012 Feb 1;5(4):22. doi: 10.1145/2086737.2086742.
We consider the problem of learning incoherent sparse and low-rank patterns from multiple tasks. Our approach is based on a linear multi-task learning formulation, in which the sparse and low-rank patterns are induced by a cardinality regularization term and a low-rank constraint, respectively. This formulation is non-convex; we convert it into its convex surrogate, which can be routinely solved via semidefinite programming for small-sized problems. We propose to employ the general projected gradient scheme to efficiently solve such a convex surrogate; however, in the optimization formulation, the objective function is non-differentiable and the feasible domain is non-trivial. We present the procedures for computing the projected gradient and ensuring the global convergence of the projected gradient scheme. The computation of the projected gradient involves a constrained optimization problem; we show that the optimal solution to such a problem can be obtained by solving an unconstrained optimization subproblem and a Euclidean projection subproblem. We also present two projected gradient algorithms and analyze their rates of convergence in detail. In addition, we illustrate the use of the presented projected gradient algorithms for the proposed multi-task learning formulation using the least squares loss. Experimental results on a collection of real-world data sets demonstrate the effectiveness of the proposed multi-task learning formulation and the efficiency of the proposed projected gradient algorithms.
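The decomposition described above can be illustrated with a minimal sketch in which the task weight matrix is split as W = P + Q, the sparse component P is handled by an unconstrained (soft-thresholding) subproblem, and the low-rank component Q by a Euclidean projection onto a trace-norm ball. This is an illustrative simplification assuming the least squares loss; the variable names, step-size choice, and iteration count are our own, not taken from the paper.

```python
import numpy as np

def soft_threshold(M, t):
    # Solves the unconstrained subproblem: elementwise prox of t * ||.||_1.
    return np.sign(M) * np.maximum(np.abs(M) - t, 0.0)

def project_l1_ball(v, tau):
    # Euclidean projection of a nonnegative vector v onto {x >= 0 : sum(x) <= tau}.
    if v.sum() <= tau:
        return v
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u > (css - tau) / (np.arange(len(u)) + 1.0))[0][-1]
    theta = (css[rho] - tau) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def project_trace_ball(M, tau):
    # Euclidean projection subproblem: projecting onto the trace-norm ball
    # reduces to projecting the singular values onto an l1 ball.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(project_l1_ball(s, tau)) @ Vt

def mtl_sparse_lowrank(X, Y, gamma=0.1, tau=1.0, iters=300):
    # Sketch of a projected gradient scheme for
    #   min_{P,Q} 0.5 * ||X(P+Q) - Y||_F^2 + gamma * ||P||_1  s.t.  ||Q||_* <= tau
    # where P captures the sparse pattern and Q the low-rank pattern.
    d, k = X.shape[1], Y.shape[1]
    P, Q = np.zeros((d, k)), np.zeros((d, k))
    # Conservative step size: gradient w.r.t. (P, Q) jointly is 2*||X||_2^2-Lipschitz.
    lr = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)
    for _ in range(iters):
        G = X.T @ (X @ (P + Q) - Y)               # smooth gradient, shared by both blocks
        P = soft_threshold(P - lr * G, lr * gamma)  # prox step on the sparse block
        Q = project_trace_ball(Q - lr * G, tau)     # projection step on the low-rank block
    return P, Q
```

A quick usage example: with synthetic data `X` (samples by features) and multi-task targets `Y` (samples by tasks), the returned `Q` satisfies the trace-norm constraint by construction, and the objective decreases from its starting value.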