IEEE Trans Neural Netw Learn Syst. 2012 Feb;23(2):247-59. doi: 10.1109/TNNLS.2011.2178560.
Radial basis function neural networks (RBFNNs) are widely used in nonlinear function approximation. One of the challenges in RBFNN modeling is how to effectively optimize the width parameters to improve approximation accuracy. To address this problem, a width optimization method, concurrent subspace width optimization (CSWO), is proposed based on a decomposition and coordination strategy. This method decomposes the large-scale width optimization problem into several subspace optimization (SSO) problems, each of which has a single optimization variable and smaller training and validation data sets, greatly reducing optimization complexity. These SSOs can be solved concurrently, which effectively reduces computation time. With top-level system coordination, the SSOs converge to a consistent optimum, which is equivalent to the optimum of the original width optimization problem. The proposed method is tested on four mathematical examples and one practical engineering approximation problem. The results demonstrate that CSWO optimizes width parameters more efficiently and robustly than traditional width optimization methods.
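The decompose-and-coordinate idea in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the authors' implementation: each width parameter is tuned in its own one-variable subspace problem (here, a simple grid line search on a validation set) while the other widths stay frozen, and a top-level coordination loop repeats the subspace solves until the widths stop changing. All function names, the grid search, and the ridge term are assumptions for illustration.

```python
import numpy as np

def rbf_predict(X, centers, widths, weights):
    # Gaussian RBFNN output: sum_j w_j * exp(-||x - c_j||^2 / (2 s_j^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2.0 * widths[None, :] ** 2))
    return Phi @ weights

def fit_weights(X, y, centers, widths, ridge=1e-8):
    # Output weights by regularized linear least squares, widths held fixed.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    Phi = np.exp(-d2 / (2.0 * widths[None, :] ** 2))
    A = Phi.T @ Phi + ridge * np.eye(Phi.shape[1])
    return np.linalg.solve(A, Phi.T @ y)

def subspace_opt(j, widths, X_tr, y_tr, X_val, y_val, centers, grid):
    # One-variable SSO: tune width j alone (here via a grid line search,
    # an illustrative stand-in for the paper's subspace optimizer).
    best_s, best_err = widths[j], np.inf
    trial = widths.copy()
    for s in grid:
        trial[j] = s
        w = fit_weights(X_tr, y_tr, centers, trial)
        err = np.mean((rbf_predict(X_val, centers, trial, w) - y_val) ** 2)
        if err < best_err:
            best_err, best_s = err, s
    return best_s

def cswo_sketch(X_tr, y_tr, X_val, y_val, centers, n_rounds=5):
    widths = np.full(len(centers), 1.0)
    grid = np.linspace(0.2, 3.0, 15)
    for _ in range(n_rounds):  # top-level coordination loop
        # The SSOs below are mutually independent and could be solved
        # concurrently; they run sequentially here for simplicity.
        new = np.array([subspace_opt(j, widths, X_tr, y_tr,
                                     X_val, y_val, centers, grid)
                        for j in range(len(centers))])
        if np.allclose(new, widths):  # consistent optimum reached
            break
        widths = new
    return widths, fit_weights(X_tr, y_tr, centers, widths)

# Toy 1-D approximation problem: fit sin(x) with 8 Gaussian RBFs.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(X[:, 0])
X_tr, y_tr, X_val, y_val = X[:80], y[:80], X[80:], y[80:]
centers = np.linspace(-3, 3, 8)[:, None]
widths, weights = cswo_sketch(X_tr, y_tr, X_val, y_val, centers)
val_mse = np.mean((rbf_predict(X_val, centers, widths, weights) - y_val) ** 2)
print(val_mse)
```

Because each subspace solve touches only one scalar, the inner loop is trivially parallelizable, which is the source of the computation-time savings the abstract describes.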