
Probabilistic Low-Rank Multitask Learning.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2018 Mar;29(3):670-680. doi: 10.1109/TNNLS.2016.2641160. Epub 2017 Jan 4.

Abstract

In this paper, we consider the problem of learning multiple related tasks simultaneously with the goal of improving the generalization performance of individual tasks. The key challenge is to effectively exploit the information shared across multiple tasks while preserving the discriminative information of each individual task. To address this, we propose a novel probabilistic model for multitask learning (MTL) that can automatically balance low-rank and sparsity constraints. The former assumes a low-rank structure of the underlying predictive hypothesis space to explicitly capture the relationships among different tasks, while the latter learns the incoherent sparse patterns private to each task. We derive an inference procedure via variational Bayesian methods. Experimental results on both regression and classification tasks from real-world applications demonstrate the effectiveness of the proposed method in dealing with MTL problems.
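The decomposition the abstract describes, a shared low-rank component capturing task relatedness plus a sparse component private to each task, can be illustrated with a minimal point-estimate sketch. This is not the paper's variational Bayesian inference: it replaces the probabilistic model with alternating gradient steps on a low-rank factorization `U @ V.T` and a proximal (soft-thresholding) step on the sparse part `S`. All function names and hyperparameters below are illustrative assumptions, not from the paper.

```python
import numpy as np

def soft_threshold(X, t):
    # Elementwise soft-thresholding: proximal operator of the L1 norm.
    return np.sign(X) * np.maximum(np.abs(X) - t, 0.0)

def multitask_low_rank_sparse(Xs, ys, rank=2, lam_l1=0.1, lr=0.1, iters=1000, seed=0):
    """Fit per-task linear weights W[:, t] = (U @ V.T + S)[:, t].

    U @ V.T is the shared low-rank component (task relatedness);
    S holds the L1-penalized sparse deviations private to each task.
    A point-estimate stand-in for the paper's probabilistic model.
    """
    rng = np.random.default_rng(seed)
    d, T = Xs[0].shape[1], len(Xs)
    U = 0.1 * rng.standard_normal((d, rank))
    V = 0.1 * rng.standard_normal((T, rank))
    S = np.zeros((d, T))
    for _ in range(iters):
        W = U @ V.T + S
        # Per-task gradient of the mean squared loss w.r.t. W[:, t].
        G = np.zeros((d, T))
        for t in range(T):
            residual = Xs[t] @ W[:, t] - ys[t]
            G[:, t] = Xs[t].T @ residual / len(ys[t])
        # Alternating gradient steps on the low-rank factors.
        U -= lr * (G @ V)
        V -= lr * (G.T @ U)
        # Proximal gradient step on the sparse task-private part.
        S = soft_threshold(S - lr * G, lr * lam_l1)
    return U, V, S
```

Given a list of per-task design matrices `Xs` and targets `ys`, the returned `U @ V.T + S` stacks one weight vector per task; the rank of the shared part and the L1 weight play the roles that the paper's model balances automatically.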

