Efficient hyperkernel learning using second-order cone programming.

Authors

Tsang Ivor Wai-hung, Kwok James Tin-yau

Affiliations

Department of Computer Science, The Hong Kong University of Science and Technology, Clear Water Bay, Hong Kong.

Publication

IEEE Trans Neural Netw. 2006 Jan;17(1):48-58. doi: 10.1109/TNN.2005.860848.

DOI: 10.1109/TNN.2005.860848
PMID: 16526475
Abstract

The kernel function plays a central role in kernel methods. Most existing methods can only adapt the kernel parameters or the kernel matrix based on empirical data. Recently, Ong et al. introduced the method of hyperkernels, which can be used to learn the kernel function directly in an inductive setting. However, the associated optimization problem is a semidefinite program (SDP), which is very computationally expensive, even with the recent advances in interior point methods. In this paper, we show that this learning problem can be equivalently reformulated as a second-order cone program (SOCP), which can then be solved more efficiently than SDPs. Comparison is also made with the kernel matrix learning method proposed by Lanckriet et al. Experimental results on both classification and regression problems, with toy and real-world data sets, show that our proposed SOCP formulation has significant speedup over the original SDP formulation. Moreover, it yields better generalization than Lanckriet et al.'s method, with a speed comparable to, and sometimes even faster than, their quadratically constrained quadratic program (QCQP) formulation.
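The abstract's efficiency argument rests on the relative cost of the two constraint classes: an SDP requires candidate matrices to remain positive semidefinite, while an SOCP only requires vectors to lie in second-order (Lorentz) cones, i.e. ||x||_2 <= t. A minimal illustrative sketch (not the paper's actual solver) contrasting the two feasibility checks:

```python
import math

def in_second_order_cone(x, t):
    """SOCP-style constraint: ||x||_2 <= t, checkable in O(n)."""
    return math.sqrt(sum(v * v for v in x)) <= t

def is_psd_2x2(m):
    """SDP-style constraint for a symmetric 2x2 matrix:
    positive semidefinite iff trace >= 0 and determinant >= 0."""
    (a, b), (_, d) = m
    return a + d >= 0 and a * d - b * b >= 0

# ||(3, 4)||_2 = 5 <= 5, so the point lies in the cone
print(in_second_order_cone([3.0, 4.0], 5.0))   # True
# [[2, 1], [1, 2]] has eigenvalues 1 and 3, so it is PSD
print(is_psd_2x2([[2.0, 1.0], [1.0, 2.0]]))    # True
# [[1, 2], [2, 1]] has eigenvalues -1 and 3, so it is not PSD
print(is_psd_2x2([[1.0, 2.0], [2.0, 1.0]]))    # False
```

For a general n-by-n matrix the PSD check requires an eigendecomposition (roughly O(n^3) per interior-point iteration), whereas cone membership stays linear in the vector length, which is the intuition behind the speedup the paper reports for the SOCP reformulation.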


Similar Articles

1. Efficient hyperkernel learning using second-order cone programming.
   IEEE Trans Neural Netw. 2006 Jan;17(1):48-58. doi: 10.1109/TNN.2005.860848.
2. Semisupervised kernel matrix learning by kernel propagation.
   IEEE Trans Neural Netw. 2010 Nov;21(11):1831-41. doi: 10.1109/TNN.2010.2076301. Epub 2010 Oct 4.
3. Robust regularized kernel regression.
   IEEE Trans Syst Man Cybern B Cybern. 2008 Dec;38(6):1639-44. doi: 10.1109/TSMCB.2008.927279.
4. Estimation of positive semidefinite correlation matrices by using convex quadratic semidefinite programming.
   Neural Comput. 2009 Jul;21(7):2028-48. doi: 10.1162/neco.2009.04-08-765.
5. Fast Gaussian kernel learning for classification tasks based on specially structured global optimization.
   Neural Netw. 2014 Sep;57:51-62. doi: 10.1016/j.neunet.2014.05.014. Epub 2014 Jun 2.
6. Efficient dual approach to distance metric learning.
   IEEE Trans Neural Netw Learn Syst. 2014 Feb;25(2):394-406. doi: 10.1109/TNNLS.2013.2275170.
7. Sparse Bayesian modeling with adaptive kernel learning.
   IEEE Trans Neural Netw. 2009 Jun;20(6):926-37. doi: 10.1109/TNN.2009.2014060. Epub 2009 May 5.
8. Fast protein classification with multiple networks.
   Bioinformatics. 2005 Sep 1;21 Suppl 2:ii59-65. doi: 10.1093/bioinformatics/bti1110.
9. Design of a multiple kernel learning algorithm for LS-SVM by convex programming.
   Neural Netw. 2011 Jun;24(5):476-83. doi: 10.1016/j.neunet.2011.03.009. Epub 2011 Mar 12.
10. Ideal regularization for learning kernels from labels.
   Neural Netw. 2014 Aug;56:22-34. doi: 10.1016/j.neunet.2014.04.003. Epub 2014 May 2.