Suppr 超能文献


New Insights Into Learning With Correntropy-Based Regression.

Affiliation

Department of Mathematics and Statistics, State University of New York at Albany, Albany, NY 12222, U.S.A.

Publication

Neural Comput. 2021 Jan;33(1):157-173. doi: 10.1162/neco_a_01334. Epub 2020 Oct 20.

DOI: 10.1162/neco_a_01334
PMID: 33080165
Abstract

Stemming from information-theoretic learning, the correntropy criterion and its applications to machine learning tasks have been extensively studied and explored. Its application to regression problems leads to the robustness-enhanced regression paradigm: correntropy-based regression. Having drawn a great variety of successful real-world applications, its theoretical properties have also been investigated recently in a series of studies from a statistical learning viewpoint. The resulting big picture is that correntropy-based regression regresses toward the conditional mode function or the conditional mean function robustly under certain conditions. Continuing this trend and going further, in this study, we report some new insights into this problem. First, we show that under the additive noise regression model, such a regression paradigm can be deduced from minimum distance estimation, implying that the resulting estimator is essentially a minimum distance estimator and thus possesses robustness properties. Second, we show that the regression paradigm in fact provides a unified approach to regression problems in that it approaches the conditional mean, the conditional mode, and the conditional median functions under certain conditions. Third, we present some new results when it is used to learn the conditional mean function by developing its error bounds and exponential convergence rates under conditional ()-moment assumptions. The saturation effect on the established convergence rates, which was observed under ()-moment assumptions, still occurs, indicating the inherent bias of the regression estimator. These novel insights deepen our understanding of correntropy-based regression, help cement the theoretic correntropy framework, and enable us to investigate learning schemes induced by general bounded nonconvex loss functions.
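As a concrete illustration of the robustness property described above, the sketch below fits a linear model under the correntropy-induced (Welsch) loss, L(θ) = mean(σ²(1 − exp(−r²/(2σ²)))) with residual r = y − xᵀθ, by plain gradient descent warm-started at the ordinary least-squares solution. This is a minimal illustrative sketch, not code from the paper: the synthetic data, the scale parameter σ, the learning rate, and all variable names are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic additive-noise regression data with a few gross outliers
# (all values here are illustrative assumptions).
n = 200
X = rng.uniform(-1.0, 1.0, size=(n, 1))
w_true, b_true = 2.0, -0.5
y = w_true * X[:, 0] + b_true + 0.1 * rng.standard_normal(n)
y[:10] += 8.0  # contaminate 5% of the responses

Xb = np.hstack([X, np.ones((n, 1))])  # design matrix with intercept column

# Ordinary least squares: not robust, pulled toward the outliers.
theta_ols, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# Correntropy-based regression: minimize the Welsch loss
#   L(theta) = mean( sigma^2 * (1 - exp(-r_i^2 / (2 sigma^2))) ),  r_i = y_i - x_i . theta
# by gradient descent, warm-started at the OLS estimate.
sigma = 0.5
lr = 0.5
theta = theta_ols.copy()
for _ in range(2000):
    r = y - Xb @ theta
    # gradient of the Welsch loss: -mean( exp(-r^2 / (2 sigma^2)) * r * x )
    grad = -((np.exp(-(r ** 2) / (2 * sigma ** 2)) * r) @ Xb) / n
    theta -= lr * grad

print("true       :", np.array([w_true, b_true]))
print("OLS        :", theta_ols)
print("correntropy:", theta)
```

Because the loss reweights each residual by exp(−r²/(2σ²)), gross outliers receive negligible weight, so the correntropy-based fit tracks the clean data while OLS is dragged toward the contaminated points. A smaller σ downweights large residuals more aggressively, at the price of the inherent-bias (saturation) effect noted in the abstract.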


Similar Articles

1. New Insights Into Learning With Correntropy-Based Regression.
Neural Comput. 2021 Jan;33(1):157-173. doi: 10.1162/neco_a_01334. Epub 2020 Oct 20.
2. Generalization analysis of deep CNNs under maximum correntropy criterion.
Neural Netw. 2024 Jun;174:106226. doi: 10.1016/j.neunet.2024.106226. Epub 2024 Mar 5.
3. Robust Variable Selection and Estimation Based on Kernel Modal Regression.
Entropy (Basel). 2019 Apr 16;21(4):403. doi: 10.3390/e21040403.
4. Broad learning system based on maximum multi-kernel correntropy criterion.
Neural Netw. 2024 Nov;179:106521. doi: 10.1016/j.neunet.2024.106521. Epub 2024 Jul 8.
5. Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion.
Entropy (Basel). 2019 Jun 29;21(7):644. doi: 10.3390/e21070644.
6. Multikernel Correntropy for Robust Learning.
IEEE Trans Cybern. 2022 Dec;52(12):13500-13511. doi: 10.1109/TCYB.2021.3110732. Epub 2022 Nov 18.
7. Information Theoretic Subspace Clustering.
IEEE Trans Neural Netw Learn Syst. 2016 Dec;27(12):2643-2655. doi: 10.1109/TNNLS.2015.2500600. Epub 2015 Dec 1.
8. Learning Korobov Functions by Correntropy and Convolutional Neural Networks.
Neural Comput. 2024 Mar 21;36(4):718-743. doi: 10.1162/neco_a_01650.
9. A Framework of Learning Through Empirical Gain Maximization.
Neural Comput. 2021 May 13;33(6):1656-1697. doi: 10.1162/neco_a_01384.
10. Broad Learning System Based on Maximum Correntropy Criterion.
IEEE Trans Neural Netw Learn Syst. 2021 Jul;32(7):3083-3097. doi: 10.1109/TNNLS.2020.3009417. Epub 2021 Jul 6.

Cited By

1. Stochastic Gradient Descent for Kernel-Based Maximum Correntropy Criterion.
Entropy (Basel). 2024 Dec 17;26(12):1104. doi: 10.3390/e26121104.