Lee T T, Jeng J T
Dept. of Electr. Eng., Nat. Taiwan Inst. of Technol., Taipei.
IEEE Trans Syst Man Cybern B Cybern. 1998;28(6):925-35. doi: 10.1109/3477.735405.
In this paper, we propose the approximate transformable technique, which includes direct transformation and indirect transformation, to obtain Chebyshev-Polynomials-Based (CPB) unified model neural networks for feedforward/recurrent neural networks via Chebyshev polynomial approximation. Based on this approximate transformable technique, we derive the relationship between single-layer neural networks and multilayer perceptron neural networks. It is shown that the CPB unified model neural networks can be represented as functional link networks based on Chebyshev polynomials, and these networks use the recursive least squares method with forgetting factor as the learning algorithm. It turns out that the CPB unified model neural networks not only have the same universal approximation capability as conventional feedforward/recurrent neural networks, but also learn faster. Furthermore, we derive the condition under which the unified model generated by Chebyshev polynomials is optimal in the sense of least squares error approximation in the single-variable case. Computer simulations show that the proposed method does have the capability of a universal approximator in some function approximation tasks, with considerable reduction in learning time.
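The two ingredients named in the abstract — a functional link network whose expansion is the Chebyshev polynomial basis, and recursive least squares (RLS) with a forgetting factor as the learning rule — can be sketched as follows. This is a minimal single-variable illustration, not the paper's exact formulation; the class name, polynomial order, and forgetting-factor value are assumptions chosen for the example.

```python
import numpy as np

def chebyshev_basis(x, order):
    # Build [T_0(x), ..., T_order(x)] with the recurrence
    # T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x), for x in [-1, 1].
    T = np.empty(order + 1)
    T[0] = 1.0
    if order >= 1:
        T[1] = x
    for k in range(2, order + 1):
        T[k] = 2.0 * x * T[k - 1] - T[k - 2]
    return T

class ChebyshevFunctionalLink:
    """Functional link network: a linear layer over a Chebyshev
    expansion of the input, trained by RLS with forgetting factor."""

    def __init__(self, order=8, forgetting=0.99, delta=1e3):
        self.order = order
        self.lam = forgetting              # forgetting factor, 0 < lam <= 1
        self.w = np.zeros(order + 1)       # output weights
        self.P = delta * np.eye(order + 1) # inverse-correlation matrix

    def update(self, x, y):
        # One RLS step on the sample (x, y).
        phi = chebyshev_basis(x, self.order)
        P_phi = self.P @ phi
        gain = P_phi / (self.lam + phi @ P_phi)
        err = y - phi @ self.w             # a priori prediction error
        self.w = self.w + gain * err
        self.P = (self.P - np.outer(gain, P_phi)) / self.lam
        return err

    def predict(self, x):
        return chebyshev_basis(x, self.order) @ self.w
```

Because the model is linear in its weights, each RLS update is a closed-form correction rather than a gradient step, which is the source of the faster learning the abstract claims relative to backpropagation-trained networks.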