College of Engineering, Mathematics and Physical Sciences, University of Exeter, Exeter, U.K.
Neural Comput. 2010 Nov;22(11):2858-86. doi: 10.1162/NECO_a_00028.
We develop a novel generalization bound for the learning the kernel problem. First, we show that the generalization analysis of the kernel learning problem reduces to investigating the suprema of the Rademacher chaos process of order 2 over candidate kernels, which we refer to as the Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity using well-established metric entropy integrals and the pseudo-dimension of the set of candidate kernels. Our methodology relies mainly on the theory of U-processes and entropy integrals. Finally, we establish satisfactory excess generalization bounds and misclassification error rates for learning Gaussian kernels and general radial basis kernels.
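As a rough illustration of the central quantity named above (the notation here is ours, not quoted from the article, and the normalization constant is an assumption that may differ from the authors' convention), an empirical Rademacher chaos complexity of order 2 over a kernel class $\mathcal{K}$ can be sketched as

$$ \hat{U}_n(\mathcal{K}) \;=\; \mathbb{E}_{\varepsilon}\!\left[\, \sup_{k \in \mathcal{K}} \Bigl|\, \tfrac{1}{n} \sum_{1 \le i < j \le n} \varepsilon_i \, \varepsilon_j \, k(x_i, x_j) \Bigr| \,\right], $$

where $\varepsilon_1, \dots, \varepsilon_n$ are i.i.d. Rademacher variables (taking the values $\pm 1$ with equal probability), $x_1, \dots, x_n$ are the training inputs, and the expectation is taken over $\varepsilon$ only, conditional on the sample. The "order 2" refers to the product of two Rademacher variables inside the supremum, which is what distinguishes this chaos process from the ordinary (order-1) Rademacher complexity.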