Chen Xiaoming, Tang Zheng, Variappan Catherine, Li Songsong, Okada Toshimi
Faculty of Engineering, Toyama University, 3190 Gofuku, Toyama-shi, Toyama 930-8555, Japan.
Int J Neural Syst. 2005 Dec;15(6):435-43. doi: 10.1142/S0129065705000426.
The complex-valued backpropagation algorithm has been widely used in fields such as telecommunications, speech recognition, and image processing with the Fourier transform. However, the local minima problem frequently arises during learning. To solve this problem and to speed up learning, we propose a modified error function that adds a term, corresponding to the hidden-layer error, to the conventional error function. The simulation results show that the proposed algorithm is capable of preventing the learning process from becoming stuck in local minima and of accelerating convergence.
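The idea of augmenting the conventional error with a hidden-layer term can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the network size, the split-type sigmoid activation, the weighting factor `lam`, and the specific hidden-layer penalty (the squared magnitude of the error backpropagated to the hidden units) are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_sigmoid(z):
    # Split-type activation: the sigmoid is applied separately to the real
    # and imaginary parts, a common choice in complex-valued backpropagation.
    return 1.0 / (1.0 + np.exp(-z.real)) + 1j / (1.0 + np.exp(-z.imag))

# A tiny 2-2-1 complex-valued network with random (hypothetical) weights.
W1 = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
W2 = rng.standard_normal((1, 2)) + 1j * rng.standard_normal((1, 2))

def forward(x):
    h = split_sigmoid(W1 @ x)   # hidden-layer activations
    y = split_sigmoid(W2 @ h)   # output-layer activations
    return h, y

def modified_error(x, d, lam=0.1):
    # Conventional complex-valued error: E = 1/2 * sum |d - y|^2.
    h, y = forward(x)
    E = 0.5 * np.sum(np.abs(d - y) ** 2)
    # Hypothetical hidden-layer term: squared magnitude of the output error
    # backpropagated through W2 to the hidden units. The actual term used in
    # the paper may differ; lam weights its contribution.
    delta_h = W2.conj().T @ (y - d)
    E_hidden = 0.5 * np.sum(np.abs(delta_h) ** 2)
    return E + lam * E_hidden

x = np.array([0.5 + 0.5j, -0.3 + 0.1j])
d = np.array([1.0 + 0.0j])
print(modified_error(x, d, lam=0.0))  # conventional error only
print(modified_error(x, d, lam=0.1))  # with the added hidden-layer term
```

Setting `lam=0` recovers the conventional error function, so the extra term can be seen as a regularizing gradient contribution intended to keep hidden units from settling into flat regions.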