Department of Mathematics, Dalian Maritime University, Dalian 116026, People's Republic of China; Research Center of Information and Control, Dalian University of Technology, Dalian 116024, People's Republic of China.
Research Center of Information and Control, Dalian University of Technology, Dalian 116024, People's Republic of China.
Cogn Neurodyn. 2014 Jun;8(3):261-6. doi: 10.1007/s11571-013-9276-7. Epub 2014 Jan 3.
This paper considers the fully complex backpropagation algorithm (FCBPA) for training fully complex-valued neural networks. We prove both the weak convergence and the strong convergence of FCBPA under mild conditions, and we show that the error function decreases monotonically during training. The derivation and analysis of the algorithm are carried out within the framework of Wirtinger calculus, which greatly reduces the descriptive complexity. The theoretical results are substantiated by a simulation example.
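The abstract does not give the algorithm's equations, but the gradient step it refers to can be illustrated with a minimal sketch. The following is a hypothetical example, not the paper's exact algorithm or notation: a single complex-valued linear neuron trained by gradient descent on the squared-error loss, where the Wirtinger (conjugate) derivative of the error supplies the steepest-descent direction, as is standard in complex-valued learning.

```python
import numpy as np

# Hypothetical sketch of a Wirtinger-calculus gradient step (assumed setup,
# not taken from the paper): one complex linear neuron y = w^T x trained on
# E = |y - t|^2 / 2. Under Wirtinger calculus, dE/d(conj(w)) = (y - t) conj(x),
# so the update is w <- w - eta * (y - t) * conj(x)  (complex LMS).

rng = np.random.default_rng(0)
n = 4
w_true = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # target weights
w = np.zeros(n, dtype=complex)                                  # initial weights
eta = 0.05                                                      # learning rate

for _ in range(3000):
    x = rng.standard_normal(n) + 1j * rng.standard_normal(n)    # random input
    t = w_true @ x                                              # noiseless target
    y = w @ x                                                   # neuron output
    w -= eta * (y - t) * np.conj(x)                             # Wirtinger-gradient step

# The weight error shrinks monotonically in expectation, echoing the
# monotone decrease of the error function proved in the paper.
print(np.max(np.abs(w - w_true)) < 1e-6)
```

Because the targets are noiseless and the step size is small, the iterates recover `w_true`; for a fully complex network one would replace the linear map with complex-valued layers and apply the same conjugate-derivative rule layer by layer.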