Suykens JK, Vandewalle J
Department of Electrical Engineering, Katholieke Universiteit Leuven, ESAT-SISTA, Kardinaal Mercierlaan 94, B-3001 Leuven (Heverlee), Belgium.
IEEE Trans Neural Netw. 1999;10(4):907-11. doi: 10.1109/72.774254.
In this paper we describe a training method for a one-hidden-layer multilayer perceptron (MLP) classifier that is based on the idea of support vector machines (SVMs). An upper bound on the Vapnik-Chervonenkis (VC) dimension is iteratively minimized over the interconnection matrix of the hidden layer and its bias vector. The output weights are determined according to the support vector method, but without making use of the classifier form related to Mercer's condition. The method is illustrated on a two-spiral classification problem.
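The abstract only outlines the approach, so the following is a minimal illustrative sketch of the general idea, not the authors' algorithm: output weights are obtained by fitting a linear SVM on the hidden-layer activations (avoiding a Mercer-kernel classifier form at the output), while the hidden-layer weights and biases are adjusted to shrink a Vapnik-style bound proxy R^2*||w||^2 on the VC dimension of the margin hyperplane. The two-spiral generator, the random-perturbation outer loop, and all function names (two_spirals, hidden, vc_bound, fit_output) are assumptions for illustration, since the paper's actual iterative minimization scheme is not given in the abstract.

```python
import numpy as np
from sklearn.svm import SVC

# --- Two-spiral toy data (stand-in for the paper's benchmark) ---
def two_spirals(n=100, noise=0.05, seed=0):
    rng = np.random.default_rng(seed)
    t = np.sqrt(rng.uniform(0.25, 1.0, n)) * 3 * np.pi
    x1 = np.c_[t * np.cos(t), t * np.sin(t)] / (3 * np.pi)
    X = np.vstack([x1, -x1]) + noise * rng.standard_normal((2 * n, 2))
    y = np.hstack([np.ones(n), -np.ones(n)])
    return X, y

def hidden(X, W, b):
    # One hidden layer with tanh activations: z = tanh(W x + b)
    return np.tanh(X @ W.T + b)

def vc_bound(Z, svm):
    # Proxy for Vapnik's upper bound on the VC dimension of a margin
    # hyperplane in hidden space, ~ R^2 * ||w||^2, with R approximated
    # by the largest distance of the hidden features to their mean.
    w = svm.coef_.ravel()
    R = np.max(np.linalg.norm(Z - Z.mean(axis=0), axis=1))
    return (R ** 2) * (w @ w)

def fit_output(X, y, W, b):
    # Output weights via a linear SVM on the hidden-layer features,
    # so no Mercer kernel is needed at the output stage.
    Z = hidden(X, W, b)
    svm = SVC(kernel="linear", C=10.0).fit(Z, y)
    return Z, svm

X, y = two_spirals()
rng = np.random.default_rng(1)
n_hidden = 20
W = rng.standard_normal((n_hidden, 2))
b = rng.standard_normal(n_hidden)

Z, svm = fit_output(X, y, W, b)
best = vc_bound(Z, svm)

# Crude outer loop: accept random perturbations of the hidden layer
# that reduce the bound proxy (a stand-in for the paper's iterative
# minimization over the interconnection matrix and bias vector).
for it in range(200):
    W_try = W + 0.1 * rng.standard_normal(W.shape)
    b_try = b + 0.1 * rng.standard_normal(b.shape)
    Z_try, svm_try = fit_output(X, y, W_try, b_try)
    val = vc_bound(Z_try, svm_try)
    if val < best:
        W, b, svm, best = W_try, b_try, svm_try, val

acc = (svm.predict(hidden(X, W, b)) == y).mean()
print(f"bound proxy: {best:.3f}, training accuracy: {acc:.3f}")
```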