
Learning with convex loss and indefinite kernels.

Affiliations

School of Statistics, University of International Business and Economics, Beijing 100029, P.R.C.

Publication Information

Neural Comput. 2014 Jan;26(1):158-84. doi: 10.1162/NECO_a_00535. Epub 2013 Oct 8.

DOI: 10.1162/NECO_a_00535
PMID: 24102124
Abstract

We consider a kind of kernel-based regression with general convex loss functions in a regularization scheme. The kernels used in the scheme are not necessarily symmetric and thus are not positive semidefinite; l(1)-norm of the coefficients in the kernel ensembles is taken as the regularizer. Our setting in this letter is quite different from the classical regularized regression algorithms such as regularized networks and support vector machines regression. Under an established error decomposition that consists of approximation error, hypothesis error, and sample error, we present a detailed mathematical analysis for this scheme and, in particular, its learning rate. A reweighted empirical process theory is applied to the analysis of produced learning algorithms, which plays a key role in deriving the explicit learning rate under some assumptions.
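The scheme described above can be illustrated with a minimal sketch: take a kernel that is not positive semidefinite, expand the hypothesis as a kernel ensemble over the training points, and fit the coefficients under squared loss with an l(1)-norm penalty. The tanh kernel and the ISTA (proximal-gradient) solver below are illustrative choices for the sketch, not the paper's construction, and the paper's analysis covers general convex losses, of which squared loss is just one instance.

```python
import numpy as np

def indefinite_kernel(X, Z):
    # Sigmoid (tanh) kernel: in general NOT positive semidefinite,
    # so it falls outside classical RKHS-based regularized regression.
    return np.tanh(X @ Z.T - 1.0)

def soft_threshold(v, t):
    # Proximal operator of the l1-norm.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def l1_kernel_regression(X, y, lam=0.05, n_iter=500):
    # Hypothesis f(x) = sum_j alpha_j K(x, x_j); fit alpha by minimizing
    # 0.5 * ||K alpha - y||^2 + lam * ||alpha||_1 via ISTA.
    K = indefinite_kernel(X, X)
    alpha = np.zeros(len(y))
    step = 1.0 / np.linalg.norm(K, 2) ** 2  # 1/L, L = Lipschitz const of grad
    for _ in range(n_iter):
        grad = K.T @ (K @ alpha - y)
        alpha = soft_threshold(alpha - step * grad, step * lam)
    return alpha

# Toy 1-D regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(60, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.standard_normal(60)
alpha = l1_kernel_regression(X, y)
pred = indefinite_kernel(X, X) @ alpha
```

Because the kernel matrix is treated as a plain design matrix, nothing in the optimization requires symmetry or positive semidefiniteness; the l(1)-penalty on the ensemble coefficients supplies the regularization that the kernel itself no longer guarantees.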

Similar Articles

1. Learning with convex loss and indefinite kernels.
   Neural Comput. 2014 Jan;26(1):158-84. doi: 10.1162/NECO_a_00535. Epub 2013 Oct 8.
2. On learning vector-valued functions.
   Neural Comput. 2005 Jan;17(1):177-204. doi: 10.1162/0899766052530802.
3. Indefinite Kernel Logistic Regression With Concave-Inexact-Convex Procedure.
   IEEE Trans Neural Netw Learn Syst. 2019 Mar;30(3):765-776. doi: 10.1109/TNNLS.2018.2851305. Epub 2018 Jul 26.
4. Design of a multiple kernel learning algorithm for LS-SVM by convex programming.
   Neural Netw. 2011 Jun;24(5):476-83. doi: 10.1016/j.neunet.2011.03.009. Epub 2011 Mar 12.
5. Framelet kernels with applications to support vector regression and regularization networks.
   IEEE Trans Syst Man Cybern B Cybern. 2010 Aug;40(4):1128-44. doi: 10.1109/TSMCB.2009.2034993. Epub 2009 Dec 4.
6. Kernel discriminant analysis for positive definite and indefinite kernels.
   IEEE Trans Pattern Anal Mach Intell. 2009 Jun;31(6):1017-32. doi: 10.1109/TPAMI.2008.290.
7. Another look at statistical learning theory and regularization.
   Neural Netw. 2009 Sep;22(7):958-69. doi: 10.1016/j.neunet.2009.04.005. Epub 2009 Apr 22.
8. Analysis of fixed-point and coordinate descent algorithms for regularized kernel methods.
   IEEE Trans Neural Netw. 2011 Oct;22(10):1576-87. doi: 10.1109/TNN.2011.2164096. Epub 2011 Aug 18.
9. Gabor-based kernel PCA with fractional power polynomial models for face recognition.
   IEEE Trans Pattern Anal Mach Intell. 2004 May;26(5):572-81. doi: 10.1109/TPAMI.2004.1273927.
10. Rademacher chaos complexities for learning the kernel problem.
    Neural Comput. 2010 Nov;22(11):2858-86. doi: 10.1162/NECO_a_00028.