Ponnapalli P S, Ho K C, Thomson M
IEEE Trans Neural Netw. 1999;10(4):964-8. doi: 10.1109/72.774273.
A formal selection and pruning technique based on the concept of a local relative sensitivity index is proposed for feedforward artificial neural networks. The mechanism of the backpropagation training algorithm is revisited and the theoretical foundation of the improved selection and pruning technique is presented. The technique is based on parallel pruning of weights that are relatively redundant within a subgroup of a feedforward neural network. Comparative studies with a similar technique proposed in the literature show that the improved technique provides better pruning results in terms of reduction of model residues, improvement of generalization capability, and reduction of network complexity. The effectiveness of the improved technique is demonstrated by developing neural network (NN) models of a number of nonlinear systems, including the three-bit parity problem, the Van der Pol equation, a chemical process, and two nonlinear discrete-time systems, using the backpropagation training algorithm with an adaptive learning rate.
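The abstract does not give the exact definition of the local relative sensitivity index, so the following is only a minimal illustrative sketch of the general idea: approximate a per-weight sensitivity, normalise it within each neuron's fan-in subgroup to obtain a relative index, and prune the relatively redundant weights of all subgroups in parallel. The sensitivity measure used here (mean squared contribution of a weight to its neuron's net input) and the function and parameter names are assumptions for illustration, not the authors' formulation.

```python
import numpy as np

def local_relative_sensitivity_pruning(W, X, threshold=0.05):
    """Prune one layer's weights using a local relative sensitivity index (LRSI).

    Illustrative sketch only: sensitivity is approximated by the mean squared
    contribution of each weight to its neuron's net input (assumption), then
    normalised within the neuron's fan-in subgroup so that the indices of a
    subgroup sum to one. Weights with a small relative index are pruned in a
    single parallel pass across all subgroups.

    W : (n_in, n_out) weight matrix of the layer
    X : (n_samples, n_in) training inputs presented to the layer
    threshold : prune weights whose LRSI falls below this value
    """
    # Per-weight contribution (w_ij * x_i)^2 for every sample, input, neuron.
    contrib = (X[:, :, None] * W[None, :, :]) ** 2        # (n_samples, n_in, n_out)
    sensitivity = contrib.mean(axis=0)                     # (n_in, n_out)

    # Local relative sensitivity index: normalise within each neuron's
    # fan-in subgroup (one column of W per neuron).
    lrsi = sensitivity / (sensitivity.sum(axis=0, keepdims=True) + 1e-12)

    # Parallel pruning: every subgroup drops its relatively redundant weights.
    mask = lrsi >= threshold
    return W * mask, mask


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    W = rng.normal(size=(8, 4))       # hypothetical layer: 8 inputs, 4 neurons
    X = rng.normal(size=(100, 8))     # hypothetical training inputs
    W_pruned, mask = local_relative_sensitivity_pruning(W, X, threshold=0.05)
    print("weights kept per neuron:", mask.sum(axis=0))
```

In a training loop of the kind described in the abstract, such a pruning step would be interleaved with backpropagation retraining, so the remaining weights can compensate for the removed ones before the next pruning pass.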