Leung Chi-Sing, Lam Ping-Man
The City University of Hong Kong, Kowloon Tong, Hong Kong, China.
Int J Neural Syst. 2003 Feb;13(1):25-38. doi: 10.1142/S0129065703001376.
The global extended Kalman filtering (EKF) algorithm for recurrent neural networks (RNNs) suffers from high computational cost and storage requirements. In this paper, we present a local EKF training-pruning approach that solves this problem. In particular, the by-products obtained during local EKF training can be utilized to measure the importance of the network weights. Compared with the original global approach, the proposed local approach requires much less computation and storage, and is hence more practical for solving real-world problems. Simulations show that our approach is an effective joint training-pruning method for RNNs under online operation.
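The core idea of the abstract — reusing by-products of EKF training, namely the state-error covariance, to rank weight importance for pruning — can be illustrated with a minimal sketch. For brevity this sketch estimates the weights of a toy linear model rather than an RNN (the paper applies local EKF to RNN weight groups via linearized Jacobians), and the saliency measure `w_i^2 / P_ii` is one common heuristic of this kind, not necessarily the paper's exact formula; all names below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: learn weights of a linear model y = w . x.
# This stands in for one local weight group of an RNN; for an RNN,
# H would be the Jacobian of the output w.r.t. the weights.
n_w = 5
w_true = np.array([2.0, -1.5, 0.0, 0.5, 0.0])  # two weights are unimportant

w = np.zeros(n_w)         # weight estimate (the EKF "state")
P = np.eye(n_w) * 10.0    # state-error covariance (EKF by-product)
R = 0.01                  # measurement-noise variance

for _ in range(200):
    x = rng.normal(size=n_w)
    y = w_true @ x + rng.normal(scale=0.1)
    H = x                            # dy/dw for the linear model
    S = H @ P @ H + R                # innovation variance (scalar output)
    K = P @ H / S                    # Kalman gain
    w = w + K * (y - w @ x)          # state (weight) update
    P = P - np.outer(K, H) @ P       # covariance update

# Pruning saliency from EKF by-products: rank weight i by its squared
# estimate over its posterior variance; small values suggest pruning.
saliency = w**2 / np.diag(P)
print(np.argsort(saliency))  # least important weights first
```

The appeal of this scheme, as the abstract notes, is that `P` is computed anyway during EKF training, so the importance measure comes essentially for free, and a local (per-group) EKF keeps `P` small compared with one global covariance over all weights.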