

Stochastic Gradient Descent for Kernel-Based Maximum Correntropy Criterion

Authors

Li Tiankai, Wang Baobin, Peng Chaoquan, Yin Hong

Affiliations

School of Mathematics and Statistics, South-Central MinZu University, Wuhan 430074, China.

School of Mathematics, Renmin University of China, Beijing 100872, China.

Publication

Entropy (Basel). 2024 Dec 17;26(12):1104. doi: 10.3390/e26121104.

DOI: 10.3390/e26121104
PMID: 39766733
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11675914/
Abstract

The maximum correntropy criterion (MCC) has been an important method in the machine learning and signal processing communities since it was successfully applied in various non-Gaussian noise scenarios. In comparison with the classical least squares (LS) method, which takes only the second-order moment of models into consideration and leads to a convex optimization problem, MCC captures the high-order information of models that plays a crucial role in robust learning, which usually requires solving non-convex optimization problems. While theoretical research on convex optimization has made significant achievements, the theoretical understanding of non-convex optimization is still far from mature. Motivated by the popularity of stochastic gradient descent (SGD) for solving non-convex problems, this paper considers SGD applied to the kernel version of MCC, which has been shown to be robust to outliers and non-Gaussian data in nonlinear structural models. As the existing theoretical results for the SGD algorithm applied to kernel MCC are not well established, we present a rigorous analysis of its convergence behavior and provide explicit convergence rates under some standard conditions. Our work fills the gap between the optimization process and convergence during the iterations: the iterates need to converge to the global minimizer, while the obtained estimator cannot ensure global optimality in the learning process.
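To make the abstract's contrast concrete: the correntropy-induced loss is l_sigma(u) = sigma^2 * (1 - exp(-u^2 / (2 sigma^2))) for the residual u = y - f(x). Its gradient carries the weight exp(-u^2 / (2 sigma^2)), which vanishes for large residuals — this is what makes MCC robust where squared loss is not. The sketch below illustrates the general technique of one-pass SGD on this loss in an RKHS, where each step adds a kernel section centered at the current sample. It is a minimal illustration under assumed settings (Gaussian kernel, polynomially decaying step size, function names of our choosing), not the authors' exact algorithm or analysis.

```python
import numpy as np

def gaussian_kernel(x, z, gamma=1.0):
    """RBF kernel K(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def kernel_mcc_sgd(X, y, sigma=1.0, gamma=1.0, eta0=0.5, theta=0.5):
    """One pass of SGD on the correntropy-induced loss
    l_sigma(u) = sigma^2 * (1 - exp(-u^2 / (2 sigma^2))), u = y - f(x).
    The iterate f_t lives in the RKHS, so each step appends one kernel
    section: f_{t+1} = f_t + eta_t * u_t * exp(-u_t^2 / (2 sigma^2)) * K(x_t, .).
    Returns the expansion coefficients alpha (one per sample seen)."""
    n = len(y)
    alpha = np.zeros(n)
    for t in range(n):
        # current prediction f_t(x_t) from the kernel expansion built so far
        f_xt = sum(alpha[i] * gaussian_kernel(X[i], X[t], gamma) for i in range(t))
        u = y[t] - f_xt                          # residual at the current sample
        w = np.exp(-u ** 2 / (2 * sigma ** 2))   # correntropy weight: ~0 for outliers
        eta = eta0 / (t + 1) ** theta            # polynomially decaying step size
        alpha[t] = eta * u * w                   # new kernel section centered at x_t
    return alpha

def predict(alpha, X_train, x, gamma=1.0):
    """Evaluate the learned RKHS function at a new point x."""
    return sum(a * gaussian_kernel(xi, x, gamma) for a, xi in zip(alpha, X_train))
```

Plain SGD on the squared loss would take a step proportional to the full residual at an outlier; here the weight exp(-u^2 / (2 sigma^2)) makes that step negligibly small, so a single grossly corrupted label barely perturbs the iterates.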


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a24d/11675914/bb26c0bc004d/entropy-26-01104-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a24d/11675914/7e71d8025c5c/entropy-26-01104-g002.jpg

Similar Articles

1. Stochastic Gradient Descent for Kernel-Based Maximum Correntropy Criterion. Entropy (Basel). 2024 Dec 17;26(12):1104. doi: 10.3390/e26121104.
2. Kernel Mixture Correntropy Conjugate Gradient Algorithm for Time Series Prediction. Entropy (Basel). 2019 Aug 11;21(8):785. doi: 10.3390/e21080785.
3. Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion. Entropy (Basel). 2019 Jun 29;21(7):644. doi: 10.3390/e21070644.
4. Stochastic Gradient Descent for Nonconvex Learning Without Bounded Gradient Assumptions. IEEE Trans Neural Netw Learn Syst. 2020 Oct;31(10):4394-4400. doi: 10.1109/TNNLS.2019.2952219. Epub 2019 Dec 11.
5. Kernel Correntropy Conjugate Gradient Algorithms Based on Half-Quadratic Optimization. IEEE Trans Cybern. 2021 Nov;51(11):5497-5510. doi: 10.1109/TCYB.2019.2959834. Epub 2021 Nov 9.
6. Multikernel Correntropy for Robust Learning. IEEE Trans Cybern. 2022 Dec;52(12):13500-13511. doi: 10.1109/TCYB.2021.3110732. Epub 2022 Nov 18.
7. Robust Ellipse Fitting With Laplacian Kernel Based Maximum Correntropy Criterion. IEEE Trans Image Process. 2021;30:3127-3141. doi: 10.1109/TIP.2021.3058785. Epub 2021 Feb 24.
8. Newtonian-Type Adaptive Filtering Based on the Maximum Correntropy Criterion. Entropy (Basel). 2020 Aug 22;22(9):922. doi: 10.3390/e22090922.
9. A Robust GPS Navigation Filter Based on Maximum Correntropy Criterion with Adaptive Kernel Bandwidth. Sensors (Basel). 2023 Nov 24;23(23):9386. doi: 10.3390/s23239386.
10. Convergence and performance analysis of kernel regularized robust recursive least squares. ISA Trans. 2020 Oct;105:396-405. doi: 10.1016/j.isatra.2020.05.025. Epub 2020 May 18.

References Cited in This Article

1. Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion. Entropy (Basel). 2019 Jun 29;21(7):644. doi: 10.3390/e21070644.
2. New Insights Into Learning With Correntropy-Based Regression. Neural Comput. 2021 Jan;33(1):157-173. doi: 10.1162/neco_a_01334. Epub 2020 Oct 20.
3. Kernel Correntropy Conjugate Gradient Algorithms Based on Half-Quadratic Optimization. IEEE Trans Cybern. 2021 Nov;51(11):5497-5510. doi: 10.1109/TCYB.2019.2959834. Epub 2021 Nov 9.
4. Stochastic Gradient Descent for Nonconvex Learning Without Bounded Gradient Assumptions. IEEE Trans Neural Netw Learn Syst. 2020 Oct;31(10):4394-4400. doi: 10.1109/TNNLS.2019.2952219. Epub 2019 Dec 11.
5. On the Generalization Ability of Online Gradient Descent Algorithm Under the Quadratic Growth Condition. IEEE Trans Neural Netw Learn Syst. 2018 Oct;29(10):5008-5019. doi: 10.1109/TNNLS.2017.2764960. Epub 2018 Jan 17.
6. A New Correntropy-Based Conjugate Gradient Backpropagation Algorithm for Improving Training in Neural Networks. IEEE Trans Neural Netw Learn Syst. 2018 Dec;29(12):6252-6263. doi: 10.1109/TNNLS.2018.2827778. Epub 2018 May 10.
7. Robust Hyperspectral Unmixing With Correntropy-Based Metric. IEEE Trans Image Process. 2015 Nov;24(11):4027-40. doi: 10.1109/TIP.2015.2456508. Epub 2015 Jul 15.
8. Maximum Correntropy Criterion for Robust Face Recognition. IEEE Trans Pattern Anal Mach Intell. 2011 Aug;33(8):1561-76. doi: 10.1109/TPAMI.2010.220. Epub 2010 Dec 10.
9. An information theoretic approach of designing sparse kernel adaptive filters. IEEE Trans Neural Netw. 2009 Dec;20(12):1950-61. doi: 10.1109/TNN.2009.2033676. Epub 2009 Nov 17.