Hyperbolic Gradient Operator and Hyperbolic Back-Propagation Learning Algorithms.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2018 May;29(5):1689-1702. doi: 10.1109/TNNLS.2017.2677446. Epub 2017 Mar 23.

Abstract

In this paper, we first extend the Wirtinger derivative, originally defined for complex functions, to hyperbolic functions, and use it to derive the hyperbolic gradient operator that yields the steepest descent direction. Next, we use the hyperbolic gradient operator to derive hyperbolic backpropagation learning algorithms for several multilayered hyperbolic neural networks (NNs). It is shown that the use of the Wirtinger derivative halves the effort needed to derive the learning algorithms, simplifies their representation, and makes the corresponding computer programs easier to write. In addition, we discuss the differences between the derived Hyperbolic-BP rules and the complex-valued backpropagation learning rule (Complex-BP). Finally, we conduct experiments with the derived learning algorithms. We find that the convergence rates of the Hyperbolic-BP learning algorithms are high even when fully hyperbolic activation functions are used, and that the Hyperbolic-BP learning algorithm for the hyperbolic NN with the split-type hyperbolic activation function can learn hyperbolic rotation as an inherent property.
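As a brief, hedged illustration of the construction the abstract summarizes (using generic split-complex notation, not necessarily the paper's own): for hyperbolic numbers z = x + uy with unipotent unit u satisfying u² = +1 and conjugate z̄ = x − uy, the analogues of the complex Wirtinger derivatives follow from x = (z + z̄)/2 and y = u(z − z̄)/2 (using 1/u = u):

```latex
% Sketch of a hyperbolic Wirtinger calculus, assuming z = x + u y with u^2 = +1:
\frac{\partial}{\partial z} =
  \frac{1}{2}\left(\frac{\partial}{\partial x} + u\,\frac{\partial}{\partial y}\right),
\qquad
\frac{\partial}{\partial \bar{z}} =
  \frac{1}{2}\left(\frac{\partial}{\partial x} - u\,\frac{\partial}{\partial y}\right).
```

Note that the signs are swapped relative to the complex case, where ∂/∂z = ½(∂/∂x − i ∂/∂y), because 1/u = u whereas 1/i = −i. By analogy with the complex case, where the gradient of a real-valued loss is proportional to the conjugate derivative, the hyperbolic gradient operator is built from these derivatives; the paper derives its exact form.

The split-type activation and the hyperbolic-rotation property mentioned at the end of the abstract can likewise be sketched in a few lines. The following is a minimal, hypothetical Python illustration of hyperbolic arithmetic and a split-type tanh activation; the function names are illustrative, and the paper's actual activation functions may differ:

```python
import math

# Minimal sketch of split-complex (hyperbolic) arithmetic, assuming the
# standard algebra with unit u satisfying u**2 = +1. A hyperbolic number
# x + u*y is represented here as the pair (x, y).

def h_mul(a, b):
    """Multiply hyperbolic numbers (x1 + u*y1)(x2 + u*y2) with u**2 = +1."""
    x1, y1 = a
    x2, y2 = b
    return (x1 * x2 + y1 * y2, x1 * y2 + y1 * x2)

def split_tanh(z):
    """Split-type activation: apply tanh to the real and unipotent
    parts independently (one common split-type choice)."""
    x, y = z
    return (math.tanh(x), math.tanh(y))

def h_rotation(z, theta):
    """Hyperbolic rotation: multiply by exp(u*theta) = cosh(theta) + u*sinh(theta)."""
    return h_mul(z, (math.cosh(theta), math.sinh(theta)))

# Example: a hyperbolic rotation preserves x**2 - y**2 (the indefinite norm).
z = (2.0, 1.0)
w = h_rotation(z, 0.5)
print(z[0]**2 - z[1]**2, w[0]**2 - w[1]**2)  # both 3.0 (up to rounding)
```

The printed values agree because multiplication by exp(uθ) preserves the indefinite norm x² − y², which is exactly the invariance a hyperbolic rotation should respect; the abstract's final experiment concerns a network learning this transformation.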
