
Domain transfer multiple kernel learning.

Affiliations

Nanyang Technological University, Nanyang Avenue, Singapore 639798.

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2012 Mar;34(3):465-79. doi: 10.1109/TPAMI.2011.114.

Abstract

Cross-domain learning methods have shown promising results by leveraging labeled patterns from the auxiliary domain to learn a robust classifier for the target domain which has only a limited number of labeled samples. To cope with the considerable change between feature distributions of different domains, we propose a new cross-domain kernel learning framework into which many existing kernel methods can be readily incorporated. Our framework, referred to as Domain Transfer Multiple Kernel Learning (DTMKL), simultaneously learns a kernel function and a robust classifier by minimizing both the structural risk functional and the distribution mismatch between the labeled and unlabeled samples from the auxiliary and target domains. Under the DTMKL framework, we also propose two novel methods by using SVM and prelearned classifiers, respectively. Comprehensive experiments on three domain adaptation data sets (i.e., TRECVID, 20 Newsgroups, and email spam data sets) demonstrate that DTMKL-based methods outperform existing cross-domain learning and multiple kernel learning methods.
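The abstract does not spell out the objective, but the distribution-mismatch term DTMKL minimizes alongside the structural risk is commonly measured by the Maximum Mean Discrepancy (MMD) between the auxiliary and target samples, evaluated under a convex combination of base kernels. The sketch below is illustrative only, not the authors' implementation: the RBF bandwidths, the helper names rbf_kernel and mmd2, and the uniform kernel weights d are all assumptions chosen for a self-contained example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gaussian RBF base kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * sq)

def mmd2(K, n_aux, n_tgt):
    """Squared MMD between auxiliary and target samples, computed from a
    kernel matrix over the stacked data [X_aux; X_tgt]:
    MMD^2 = s^T K s, with s_i = 1/n_aux (auxiliary) or -1/n_tgt (target)."""
    s = np.concatenate([np.full(n_aux, 1.0 / n_aux),
                        np.full(n_tgt, -1.0 / n_tgt)])
    return s @ K @ s

# Toy data: two domains with shifted means to create a distribution gap.
rng = np.random.default_rng(0)
X_aux = rng.normal(0.0, 1.0, size=(100, 5))
X_tgt = rng.normal(0.5, 1.0, size=(40, 5))
X = np.vstack([X_aux, X_tgt])

# Base kernels at several bandwidths; a multiple kernel learning method
# combines them as k = sum_m d_m * k_m with d_m >= 0 and sum_m d_m = 1.
gammas = [0.01, 0.1, 1.0]
base_K = [rbf_kernel(X, X, g) for g in gammas]

# Hypothetical uniform weights for illustration; DTMKL instead learns d
# jointly with the classifier by minimizing structural risk + mismatch.
d = np.full(len(base_K), 1.0 / len(base_K))
K = sum(w * Km for w, Km in zip(d, base_K))
print("MMD^2 under combined kernel:", mmd2(K, len(X_aux), len(X_tgt)))
```

In the full framework this mismatch term is added to a classifier's structural risk (e.g., the SVM hinge loss), so the learned kernel weights trade off fitting the labeled data against aligning the two domains' feature distributions.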

