Michael Schmitt
Lehrstuhl Mathematik und Informatik, Fakultät für Mathematik, Ruhr-Universität Bochum, D-44780 Bochum, Germany.
Neural Comput. 2005 Mar;17(3):715-29. doi: 10.1162/0899766053019953.
Higher-order neurons with k monomials in n variables are shown to have Vapnik-Chervonenkis (VC) dimension at least nk + 1. This result supersedes the previously known lower bound, which was obtained via k-term monotone disjunctive normal form (DNF) formulas. Moreover, it implies that the VC dimension of higher-order neurons with k monomials is strictly larger than that of k-term monotone DNF formulas. The result is established by introducing an exponential approach that employs Gaussian radial basis function neural networks to obtain classifications of points in terms of higher-order neurons.
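For reference, the following LaTeX sketch states the model class and the bound claimed above, assuming the standard definition of a higher-order (sigma-pi) threshold unit with k monomials over n inputs; the symbols M_i and H_{n,k} are notation introduced here for illustration and need not match the paper's.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% A higher-order neuron over inputs $x_1,\dots,x_n$ with $k$ monomials:
% each monomial is a product over a subset $M_i$ of the input variables
% (assumed standard definition, not quoted from the paper).
\[
  f(x) \;=\; \operatorname{sgn}\!\Bigl( w_0 + \sum_{i=1}^{k} w_i \prod_{j \in M_i} x_j \Bigr),
  \qquad M_i \subseteq \{1,\dots,n\}.
\]
% Lower bound stated in the abstract for this class $\mathcal{H}_{n,k}$:
\[
  \operatorname{VCdim}\bigl(\mathcal{H}_{n,k}\bigr) \;\ge\; nk + 1.
\]
\end{document}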