Wang Baobin, Hu Ting
School of Mathematics and Statistics, South-Central University for Nationalities, Wuhan 430074, China.
School of Mathematics and Statistics, Wuhan University, Wuhan 430072, China.
Entropy (Basel). 2019 Jun 29;21(7):644. doi: 10.3390/e21070644.
In the framework of statistical learning, we study the online gradient descent algorithm generated by correntropy-induced losses in reproducing kernel Hilbert spaces (RKHS). As a generalized correlation measure, correntropy has been widely applied in practice owing to its prominent robustness. Although online gradient descent is an efficient way to implement the maximum correntropy criterion (MCC) in nonparametric estimation, no consistency analysis or rigorous error bounds have been established for it. We provide a theoretical understanding of the online algorithm for MCC and show that, with a suitably chosen scaling parameter, its convergence rate in regression analysis can be minimax optimal (up to a logarithmic factor). Our results show that the scaling parameter plays an essential role in both robustness and consistency.
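For concreteness, the following is a minimal sketch of the kind of online update the abstract refers to: at step t, an RKHS estimate f_t is updated by f_{t+1} = f_t - eta_t * ell'_sigma(y_t, f_t(x_t)) * K(x_t, .), where ell_sigma(y, u) = sigma^2 * (1 - exp(-(y - u)^2 / sigma^2)) is the correntropy-induced loss with scaling parameter sigma. All concrete choices below (Gaussian kernel, bandwidth, step sizes eta_t = eta0 / sqrt(t), synthetic data) are illustrative assumptions, not the settings analyzed in the paper.

```python
import numpy as np

def gaussian_kernel(x, z, bw=0.5):
    """Gaussian RBF kernel K(x, z); an illustrative choice of RKHS kernel."""
    return np.exp(-(x - z) ** 2 / (2 * bw ** 2))

def online_mcc(xs, ys, sigma=1.0, eta0=0.5, bw=0.5):
    """Online gradient descent for the correntropy-induced loss
    ell_sigma(y, u) = sigma^2 * (1 - exp(-(y - u)^2 / sigma^2))
    in the RKHS of a Gaussian kernel. After t steps the estimate is
    f_t = sum_i alpha_i * K(x_i, .), so only the coefficients alpha
    and the observed centers x_i need to be stored.
    """
    centers, alphas = [], []
    for t, (x, y) in enumerate(zip(xs, ys), start=1):
        # Evaluate the current estimate f_t at the new input x.
        fx = sum(a * gaussian_kernel(c, x, bw) for c, a in zip(centers, alphas))
        residual = y - fx
        # Derivative of the correntropy loss w.r.t. the prediction u = f(x):
        # ell'_sigma(y, u) = -2 * (y - u) * exp(-(y - u)^2 / sigma^2).
        grad = -2.0 * residual * np.exp(-residual ** 2 / sigma ** 2)
        # Polynomially decaying step size (an illustrative schedule).
        eta = eta0 / t ** 0.5
        # RKHS update: f_{t+1} = f_t - eta * ell'_sigma * K(x_t, .).
        centers.append(x)
        alphas.append(-eta * grad)
    return centers, alphas

# Usage on synthetic data with heavy-tailed noise (hypothetical example).
rng = np.random.default_rng(0)
xs = rng.uniform(-1.0, 1.0, size=200)
ys = np.sin(np.pi * xs) + 0.1 * rng.standard_t(df=2, size=200)
centers, alphas = online_mcc(xs, ys, sigma=1.0)
# Evaluate the learned estimate at a test point.
f_at_half = sum(a * gaussian_kernel(c, 0.5) for c, a in zip(centers, alphas))
```

Note how ell'_sigma damps large residuals exponentially: an outlier moves the estimate far less than it would under the square loss, which reflects the robustness role the abstract attributes to the scaling parameter sigma.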