
Convergence analysis of fully complex backpropagation algorithm based on Wirtinger calculus.

Affiliations

Department of Mathematics, Dalian Maritime University, Dalian, 116026, People's Republic of China; Research Center of Information and Control, Dalian University of Technology, Dalian, 116024, People's Republic of China.

Research Center of Information and Control, Dalian University of Technology, Dalian, 116024, People's Republic of China.

Publication information

Cogn Neurodyn. 2014 Jun;8(3):261-6. doi: 10.1007/s11571-013-9276-7. Epub 2014 Jan 3.

Abstract

This paper considers the fully complex backpropagation algorithm (FCBPA) for training fully complex-valued neural networks. We prove both the weak convergence and the strong convergence of FCBPA under mild conditions. The monotonic decrease of the error function during training is also established. The derivation and analysis of the algorithm are carried out within the framework of Wirtinger calculus, which greatly reduces the descriptive complexity. The theoretical results are substantiated by a simulation example.
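The update rule analyzed in the abstract can be illustrated with a minimal numerical sketch. For a real-valued loss E of complex weights w, steepest descent in the Wirtinger framework steps along the conjugate Wirtinger derivative, w ← w − η·∂E/∂w̄. The example below is a hypothetical one-neuron toy (not the paper's experiment): a fully complex neuron y = tanh(w·x) fitted to a fixed complex target d under the squared error E = ½|y − d|², with the learning rate eta, the target d, and the dimension n chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
x = rng.standard_normal(n) + 1j * rng.standard_normal(n)  # fixed complex input
d = 0.4 - 0.3j            # illustrative complex target (tanh(s) = d is solvable)
w = np.zeros(n, dtype=complex)  # trainable complex weights
eta = 0.05                # learning rate (chosen small for stability)

def loss(w):
    e = np.tanh(w @ x) - d
    return 0.5 * (e * e.conj()).real

# For holomorphic f (here complex tanh), dE/d(conj(w)) = e * conj(f'(s)) * conj(x),
# since E = (1/2) e conj(e) with e = f(s) - d and s = w @ x.
losses = [loss(w)]
for _ in range(300):
    s = w @ x
    e = np.tanh(s) - d                 # complex output error
    fprime = 1.0 - np.tanh(s) ** 2     # derivative of complex tanh
    grad_wbar = e * np.conj(fprime) * np.conj(x)  # conjugate Wirtinger gradient
    w = w - eta * grad_wbar            # steepest-descent update
    losses.append(loss(w))

print(losses[0], losses[-1])  # the error decreases over training
```

Note that only the single derivative ∂E/∂w̄ is needed; treating w and its conjugate as independent variables is what spares the split into real and imaginary parts, which is the "reduced descriptive complexity" the abstract refers to.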


Similar articles

2
Convergence analysis of an augmented algorithm for fully complex-valued neural networks.
Neural Netw. 2015 Sep;69:44-50. doi: 10.1016/j.neunet.2015.05.003. Epub 2015 May 27.
4
Deterministic convergence of chaos injection-based gradient method for training feedforward neural networks.
Cogn Neurodyn. 2015 Jun;9(3):331-40. doi: 10.1007/s11571-014-9323-z. Epub 2015 Jan 1.
5
Hyperbolic Gradient Operator and Hyperbolic Back-Propagation Learning Algorithms.
IEEE Trans Neural Netw Learn Syst. 2018 May;29(5):1689-1702. doi: 10.1109/TNNLS.2017.2677446. Epub 2017 Mar 23.
6
Fractional-order gradient descent learning of BP neural networks with Caputo derivative.
Neural Netw. 2017 May;89:19-30. doi: 10.1016/j.neunet.2017.02.007. Epub 2017 Feb 22.
7
Backpropagation and ordered derivatives in the time scales calculus.
IEEE Trans Neural Netw. 2010 Aug;21(8):1262-9. doi: 10.1109/TNN.2010.2050332. Epub 2010 Jul 8.
8
Study of the convergence behavior of the complex kernel least mean square algorithm.
IEEE Trans Neural Netw Learn Syst. 2013 Sep;24(9):1349-63. doi: 10.1109/TNNLS.2013.2256367.
10
Kurtosis-Based CRTRL Algorithms for Fully Connected Recurrent Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2018 Dec;29(12):6123-6131. doi: 10.1109/TNNLS.2018.2826442. Epub 2018 May 1.

Cited by

1
Event-based exponential synchronization of complex networks.
Cogn Neurodyn. 2016 Oct;10(5):423-36. doi: 10.1007/s11571-016-9391-3. Epub 2016 Jun 6.
2
Deterministic convergence of chaos injection-based gradient method for training feedforward neural networks.
Cogn Neurodyn. 2015 Jun;9(3):331-40. doi: 10.1007/s11571-014-9323-z. Epub 2015 Jan 1.

References

1
A computational neural model of orientation detection based on multiple guesses: comparison of geometrical and algebraic models.
Cogn Neurodyn. 2013 Oct;7(5):361-79. doi: 10.1007/s11571-012-9235-8. Epub 2012 Dec 25.
2
Local minima in hierarchical structures of complex-valued neural networks.
Neural Netw. 2013 Jul;43:1-7. doi: 10.1016/j.neunet.2013.02.002. Epub 2013 Feb 18.
3
A Kalman filtering approach to the representation of kinematic quantities by the hippocampal-entorhinal complex.
Cogn Neurodyn. 2010 Dec;4(4):315-35. doi: 10.1007/s11571-010-9115-z. Epub 2010 Jun 8.
5
Boundedness and convergence of online gradient method with penalty for feedforward neural networks.
IEEE Trans Neural Netw. 2009 Jun;20(6):1050-4. doi: 10.1109/TNN.2009.2020848. Epub 2009 May 8.
6
Deterministic convergence of an online gradient method for BP neural networks.
IEEE Trans Neural Netw. 2005 May;16(3):533-40. doi: 10.1109/TNN.2005.844903.
7
Approximation by fully complex multilayer perceptrons.
Neural Comput. 2003 Jul;15(7):1641-66. doi: 10.1162/089976603321891846.
8
An Extension of the Back-Propagation Algorithm to Complex Numbers.
Neural Netw. 1997 Nov;10(8):1391-1415. doi: 10.1016/s0893-6080(97)00036-1.
