IEEE Trans Cybern. 2022 Dec;52(12):13500-13511. doi: 10.1109/TCYB.2021.3110732. Epub 2022 Nov 18.
As a novel similarity measure defined as the expectation of a kernel function between two random variables, correntropy has been successfully applied in robust machine learning and signal processing to combat large outliers. The kernel function in correntropy is usually a zero-mean Gaussian kernel. In a recent work, the concept of mixture correntropy (MC) was proposed to improve the learning performance, where the kernel function is a mixture Gaussian kernel, namely, a linear combination of several zero-mean Gaussian kernels with different widths. In both correntropy and MC, however, the center of the kernel function is always located at zero. In the present work, to further improve the learning performance, we propose the concept of multikernel correntropy (MKC), in which each component of the mixture Gaussian kernel can be centered at a different location. The properties of the MKC are investigated, and an efficient approach is proposed to determine its free parameters. Experimental results show that learning algorithms under the maximum MKC criterion (MMKCC) can outperform those under the original maximum correntropy criterion (MCC) and the maximum MC criterion (MMCC).
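To make the definitions concrete, the sketch below gives the usual sample-mean estimators: plain correntropy as the mean of a zero-mean Gaussian kernel of the errors, and MKC as a weighted sum of Gaussian kernels with per-component widths and centers. This is a minimal illustration assuming the standard empirical form; the function and parameter names are illustrative and not the paper's notation.

```python
import numpy as np

def gaussian_kernel(e, sigma, center=0.0):
    # Gaussian kernel of the error e, with width sigma and center `center`
    return np.exp(-(e - center) ** 2 / (2.0 * sigma ** 2))

def correntropy(x, y, sigma=1.0):
    # Empirical correntropy: sample mean of a zero-mean Gaussian kernel of the errors
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return gaussian_kernel(e, sigma).mean()

def multikernel_correntropy(x, y, alphas, sigmas, centers):
    # Empirical MKC: linear combination of Gaussian kernels, each with its
    # own mixing weight alpha_m, width sigma_m, and center c_m (possibly nonzero)
    e = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return sum(a * gaussian_kernel(e, s, c).mean()
               for a, s, c in zip(alphas, sigmas, centers))
```

With a single component centered at zero, the MKC estimator reduces to ordinary correntropy, which matches the paper's description of MKC as a generalization of correntropy and MC.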