
Active Learning With Multiple Kernels.

Authors

Hong Songnam, Chae Jeongmin

Publication

IEEE Trans Neural Netw Learn Syst. 2022 Jul;33(7):2980-2994. doi: 10.1109/TNNLS.2020.3047953. Epub 2022 Jul 6.

Abstract

Online multiple kernel learning (OMKL) has shown attractive performance in nonlinear function learning tasks. Leveraging a random feature (RF) approximation, the major drawback of OMKL, known as the curse of dimensionality, has recently been alleviated; this makes RF-based OMKL practical. In this article, we introduce a new research problem, named stream-based active MKL (AMKL), in which a learner is allowed to request the labels of selected data from an oracle according to a selection criterion. This is necessary for many real-world applications, as acquiring a true label is costly or time-consuming. We theoretically prove that the proposed AMKL achieves an optimal sublinear regret O(√T), as in OMKL, with little labeled data, implying that the proposed selection criterion indeed avoids unnecessary label requests. Furthermore, we present AMKL with adaptive kernel selection (named AMKL-AKS), in which irrelevant kernels can be excluded from a kernel dictionary "on the fly." This approach improves the efficiency of active learning and the accuracy of function learning. Via numerical tests with real data sets, we verify the superiority of AMKL-AKS, which attains accuracy similar to its OMKL counterpart while using fewer labeled data.
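The ingredients described in the abstract can be illustrated with a minimal sketch: random Fourier features approximate each RBF kernel in the dictionary, per-kernel linear predictors are combined with Hedge-style multiplicative weights, and a label is requested from the oracle only when the per-kernel predictions disagree beyond a threshold. This is a simplified stand-in, not the paper's exact algorithm or selection criterion; all names (`make_rf_map`, `amkl_sketch`) and parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_rf_map(dim, n_feats, gamma):
    # Random Fourier features phi with phi(x) @ phi(y) ~ exp(-gamma * ||x - y||^2);
    # for this kernel the spectral density is Gaussian with per-coordinate
    # variance 2 * gamma.
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(dim, n_feats))
    b = rng.uniform(0.0, 2 * np.pi, size=n_feats)
    return lambda x: np.sqrt(2.0 / n_feats) * np.cos(x @ W + b)

def amkl_sketch(X, y, gammas=(0.1, 1.0, 10.0), n_feats=100,
                eta=0.5, lr=0.1, threshold=0.05):
    """Stream over (X, y); return last prediction, kernel weights, #label queries."""
    T, d = X.shape
    maps = [make_rf_map(d, n_feats, g) for g in gammas]
    # Small random init so per-kernel predictors start off distinct.
    thetas = [0.1 * rng.normal(size=n_feats) for _ in gammas]
    w = np.ones(len(gammas)) / len(gammas)      # kernel combination weights
    n_queries = 0
    for t in range(T):
        feats = [m(X[t]) for m in maps]
        preds = np.array([th @ f for th, f in zip(thetas, feats)])
        y_hat = w @ preds                       # combined prediction
        # Hypothetical selection criterion: query the oracle only when the
        # per-kernel predictions disagree enough to be informative.
        if np.max(np.abs(preds - y_hat)) > threshold:
            n_queries += 1
            losses = (preds - y[t]) ** 2
            w *= np.exp(-eta * losses)          # Hedge-style weight update
            w /= w.sum()
            for p, f in enumerate(feats):       # gradient step per predictor
                thetas[p] -= lr * 2 * (preds[p] - y[t]) * f
    return y_hat, w, n_queries
```

On a stream where only some rounds trigger a query, `n_queries` stays below the stream length `T`, which is the point of the selection criterion: rounds where all kernels already agree contribute no informative label.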

Similar Articles

1. Distributed Online Learning With Multiple Kernels.
IEEE Trans Neural Netw Learn Syst. 2023 Mar;34(3):1263-1277. doi: 10.1109/TNNLS.2021.3105146. Epub 2023 Feb 28.
2. Online Multikernel Learning Method via Online Biconvex Optimization.
IEEE Trans Neural Netw Learn Syst. 2024 Nov;35(11):16630-16643. doi: 10.1109/TNNLS.2023.3296895. Epub 2024 Oct 29.
3. Communication-Efficient Randomized Algorithm for Multi-Kernel Online Federated Learning.
IEEE Trans Pattern Anal Mach Intell. 2022 Dec;44(12):9872-9886. doi: 10.1109/TPAMI.2021.3129809. Epub 2022 Nov 7.
4. An Adaptive Approach to Learning Optimal Neighborhood Kernels.
IEEE Trans Cybern. 2013 Feb;43(1):371-84. doi: 10.1109/TSMCB.2012.2207889. Epub 2012 Jul 30.
5. Reduced multiple empirical kernel learning machine.
Cogn Neurodyn. 2015 Feb;9(1):63-73. doi: 10.1007/s11571-014-9304-2. Epub 2014 Jul 29.
6. Incremental Ensemble Gaussian Processes.
IEEE Trans Pattern Anal Mach Intell. 2023 Feb;45(2):1876-1893. doi: 10.1109/TPAMI.2022.3157197. Epub 2023 Jan 6.
