Karayiannis N B
Department of Electrical and Computer Engineering, University of Houston, Houston, TX 77204-4793, USA.
IEEE Trans Neural Netw. 1999;10(3):657-71. doi: 10.1109/72.761725.
This paper presents an axiomatic approach for constructing radial basis function (RBF) neural networks. This approach results in a broad variety of admissible RBF models, including those employing Gaussian RBFs. The form of the RBFs is determined by a generator function. New RBF models can be developed according to the proposed approach by selecting generator functions other than exponential ones, which lead to Gaussian RBFs. This paper also proposes a supervised learning algorithm based on gradient descent for training reformulated RBF neural networks constructed using the proposed approach. A sensitivity analysis of the proposed algorithm relates the properties of the RBFs to the convergence of gradient descent learning. Experiments involving a variety of reformulated RBF networks generated by linear and exponential generator functions indicate that gradient descent learning is simple, easily implementable, and produces RBF networks that perform considerably better than conventional RBF models trained by existing algorithms.
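To make the abstract's setting concrete, here is a minimal sketch of a Gaussian RBF network, the special case that the paper says arises from an exponential generator function, trained by gradient descent on all free parameters (prototypes, widths, and output weights). This is an illustration of the general technique, not the paper's exact formulation; the toy data, learning rate, and network size are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression task (illustrative assumption): y = sin(2*pi*x) + noise
X = rng.uniform(0.0, 1.0, size=(64, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.05 * rng.normal(size=64)

K = 8                                     # number of RBF units (assumed)
V = rng.uniform(0.0, 1.0, size=(K, 1))    # prototypes (centers)
beta = np.full(K, 20.0)                   # inverse-width parameters (assumed)
w = np.zeros(K)                           # output weights
b = 0.0                                   # output bias
lr = 0.05                                 # learning rate (assumed)

def forward(X, V, beta):
    # Squared distances ||x - v_k||^2, then Gaussian responses exp(-beta_k * d2),
    # i.e., the RBF produced by an exponential generator function.
    d2 = ((X[:, None, :] - V[None, :, :]) ** 2).sum(axis=2)  # shape (N, K)
    return np.exp(-beta[None, :] * d2), d2

for epoch in range(5000):
    H, d2 = forward(X, V, beta)           # hidden responses, shape (N, K)
    pred = H @ w + b
    err = pred - y                        # residuals, shape (N,)

    # Gradient descent on mean squared error w.r.t. every free parameter.
    w -= lr * (H.T @ err) / len(y)
    b -= lr * err.mean()

    # Chain rule through the Gaussian unit for centers and widths:
    # dH/dv_k = H * (-2 * beta_k) * (v_k - x),   dH/dbeta_k = H * (-d2).
    g = err[:, None] * w[None, :] * H     # shape (N, K)
    V -= lr * (g[:, :, None] * (-2.0 * beta[None, :, None])
               * (V[None, :, :] - X[:, None, :])).mean(axis=0)
    beta -= lr * (g * (-d2)).mean(axis=0)

mse = np.mean((forward(X, V, beta)[0] @ w + b - y) ** 2)
```

After training, `mse` should be well below the variance of the target signal, illustrating the abstract's claim that plain gradient descent on a reformulated RBF network is simple and easily implementable. Swapping the `np.exp(-beta * d2)` nonlinearity for one derived from a different (e.g., linear) generator function would yield one of the other admissible RBF models the paper describes.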