
Robust Multi-Task Learning With Flexible Manifold Constraint.

Author information

Zhang Rui, Zhang Hongyuan, Li Xuelong

Publication information

IEEE Trans Pattern Anal Mach Intell. 2021 Jun;43(6):2150-2157. doi: 10.1109/TPAMI.2020.3007637. Epub 2021 May 11.

Abstract

Multi-task learning attempts to exploit the information shared among multiple related tasks to obtain better solutions. However, the performance of existing multi-task approaches degenerates considerably when the data are polluted, i.e., contain outliers. In this paper, we propose a novel robust multi-task model that incorporates a flexible manifold constraint (FMC-MTL) and a robust loss. Specifically, the multi-task subspace is embedded with a relaxed, generalized Stiefel manifold so as to consider point-wise correlation and preserve the data structure simultaneously. In addition, a robust loss function is developed that ensures robustness to outliers by smoothly interpolating between the l-norm and the squared Frobenius norm. Equipped with an efficient algorithm, FMC-MTL serves as a robust solution for severely polluted data. Moreover, extensive experiments are conducted to verify the superiority of the model: compared to state-of-the-art multi-task models, the proposed FMC-MTL demonstrates remarkable robustness to contaminated data.
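The abstract only states that the robust loss smoothly interpolates between an l-norm and the squared Frobenius norm; the exact form used in FMC-MTL is not given here. The following is a minimal sketch, assuming a row-wise adaptive loss of the form sum_i (1 + sigma) * ||r_i||^2 / (||r_i|| + sigma), which is one known way to obtain such an interpolation: it reduces to the l2,1-norm as sigma -> 0 and to the squared Frobenius norm as sigma -> inf. The function name and the sigma parameter are illustrative assumptions, not the paper's notation.

import numpy as np

def adaptive_robust_loss(R, sigma=1.0):
    # R: residual matrix of shape (n_samples, n_tasks); row r_i is one sample's residual.
    # sigma: interpolation parameter (assumed for illustration, not taken from the paper).
    # Returns sum_i (1 + sigma) * ||r_i||_2^2 / (||r_i||_2 + sigma).
    row_norms = np.linalg.norm(R, axis=1)
    return np.sum((1.0 + sigma) * row_norms ** 2 / (row_norms + sigma))

# Sanity check of the limiting behaviour on a small random residual matrix.
R = np.random.randn(5, 3)
l21 = np.sum(np.linalg.norm(R, axis=1))           # l2,1-norm of R
fro2 = np.sum(R ** 2)                             # squared Frobenius norm of R
print(adaptive_robust_loss(R, sigma=1e-8), l21)   # approximately equal for small sigma
print(adaptive_robust_loss(R, sigma=1e8), fro2)   # approximately equal for large sigma

With a small sigma, each row's contribution grows only linearly in ||r_i|| for large residuals, so samples that behave like outliers are down-weighted relative to the squared loss, which is the robustness property the abstract describes.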

