
SVRG-MKL: A Fast and Scalable Multiple Kernel Learning Solution for Features Combination in Multi-Class Classification Problems.

Author Information

Alioscha-Perez Mitchel, Oveneke Meshia Cedric, Sahli Hichem

Publication Information

IEEE Trans Neural Netw Learn Syst. 2020 May;31(5):1710-1723. doi: 10.1109/TNNLS.2019.2922123. Epub 2019 Jul 4.

DOI: 10.1109/TNNLS.2019.2922123
PMID: 31283489
Abstract

In this paper, we present a novel strategy to combine a set of compact descriptors to leverage an associated recognition task. We formulate the problem from a multiple kernel learning (MKL) perspective and solve it following a stochastic variance reduced gradient (SVRG) approach to address its scalability, currently an open issue. MKL models are ideal candidates to jointly learn the optimal combination of features along with its associated predictor. However, they are unable to scale beyond a dozen thousand samples due to high computational and memory requirements, which severely limits their applicability. We propose SVRG-MKL, an MKL solution with inherent scalability properties that can optimally combine multiple descriptors involving millions of samples. Our solution takes place directly in the primal to avoid Gram matrices computation and memory allocation, whereas the optimization is performed with a proposed algorithm of linear complexity and hence computationally efficient. Our proposition builds upon recent progress in SVRG with the distinction that each kernel is treated differently during optimization, which results in a faster convergence than applying off-the-shelf SVRG into MKL. Extensive experimental validation conducted on several benchmarking data sets confirms a higher accuracy and a significant speedup of our solution. Our technique can be extended to other MKL problems, including visual search and transfer learning, as well as other formulations, such as group-sensitive (GMKL) and localized MKL (LMKL) in convex settings.
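The building block the abstract refers to is the stochastic variance reduced gradient (SVRG) update: each epoch fixes a snapshot of the weights, computes one full gradient there, and then takes cheap per-sample steps whose noise is corrected against that snapshot. As a rough illustration only (this is plain SVRG on a ridge-regression objective, not the paper's per-kernel MKL algorithm; the function and parameter names are invented for this sketch):

```python
import numpy as np

def svrg(X, y, lam=0.1, step=0.02, epochs=30, inner=None, seed=0):
    """Plain SVRG for ridge regression:
        min_w (1/2n) * ||Xw - y||^2 + (lam/2) * ||w||^2.

    Each epoch computes one full gradient at a snapshot w_snap, then runs
    cheap inner steps along the variance-reduced direction
        g_i(w) - g_i(w_snap) + full_grad(w_snap),
    where g_i is the gradient on a single random sample i.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    if inner is None:
        inner = 2 * n  # a common choice for the inner-loop length
    w = np.zeros(d)
    for _ in range(epochs):
        w_snap = w.copy()
        # Full gradient at the snapshot: the "anchor" that kills the variance.
        full_grad = X.T @ (X @ w_snap - y) / n + lam * w_snap
        for _ in range(inner):
            i = rng.integers(n)
            xi, yi = X[i], y[i]
            gi_w = xi * (xi @ w - yi) + lam * w            # sample grad at w
            gi_snap = xi * (xi @ w_snap - yi) + lam * w_snap  # same sample at snapshot
            w -= step * (gi_w - gi_snap + full_grad)
    return w
```

Because each inner step touches a single row of X, the cost per step is linear in the dimension and no n-by-n Gram matrix is ever formed, which is the same primal, linear-complexity flavor the abstract claims; the paper's contribution is treating each kernel's block differently inside this loop.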


Similar Articles

1. SVRG-MKL: A Fast and Scalable Multiple Kernel Learning Solution for Features Combination in Multi-Class Classification Problems.
IEEE Trans Neural Netw Learn Syst. 2020 May;31(5):1710-1723. doi: 10.1109/TNNLS.2019.2922123. Epub 2019 Jul 4.
2. Efficient sparse generalized multiple kernel learning.
IEEE Trans Neural Netw. 2011 Mar;22(3):433-46. doi: 10.1109/TNN.2010.2103571. Epub 2011 Jan 20.
3. Multiple Kernel Learning for Visual Object Recognition: A Review.
IEEE Trans Pattern Anal Mach Intell. 2014 Jul;36(7):1354-69. doi: 10.1109/TPAMI.2013.212.
4. L2-norm multiple kernel learning and its application to biomedical data fusion.
BMC Bioinformatics. 2010 Jun 8;11:309. doi: 10.1186/1471-2105-11-309.
5. A Multiple Kernel Learning Model Based on -Norm.
Comput Intell Neurosci. 2018 Jan 23;2018:1018789. doi: 10.1155/2018/1018789. eCollection 2018.
6. Generalized multiple kernel learning with data-dependent priors.
IEEE Trans Neural Netw Learn Syst. 2015 Jun;26(6):1134-48. doi: 10.1109/TNNLS.2014.2334137. Epub 2014 Jul 25.
7. Efficient Multiple Kernel Learning Algorithms Using Low-Rank Representation.
Comput Intell Neurosci. 2017;2017:3678487. doi: 10.1155/2017/3678487. Epub 2017 Aug 22.
8. Reduced multiple empirical kernel learning machine.
Cogn Neurodyn. 2015 Feb;9(1):63-73. doi: 10.1007/s11571-014-9304-2. Epub 2014 Jul 29.
9. Soft margin multiple kernel learning.
IEEE Trans Neural Netw Learn Syst. 2013 May;24(5):749-61. doi: 10.1109/TNNLS.2012.2237183.
10. Absent Multiple Kernel Learning Algorithms.
IEEE Trans Pattern Anal Mach Intell. 2020 Jun;42(6):1303-1316. doi: 10.1109/TPAMI.2019.2895608. Epub 2019 Jan 28.

Cited By

1. Multi-Nyström Method Based on Multiple Kernel Learning for Large Scale Imbalanced Classification.
Comput Intell Neurosci. 2021 Jun 13;2021:9911871. doi: 10.1155/2021/9911871. eCollection 2021.
2. MKL-GRNI: A parallel multiple kernel learning approach for supervised inference of large-scale gene regulatory networks.
PeerJ Comput Sci. 2021 Jan 28;7:e363. doi: 10.7717/peerj-cs.363. eCollection 2021.