Adaptive complex-valued stepsize based fast learning of complex-valued neural networks.

Affiliations

School of Electronics and Information Engineering, Soochow University, Suzhou 215006, PR China.

Publication Info

Neural Netw. 2020 Apr;124:233-242. doi: 10.1016/j.neunet.2020.01.011. Epub 2020 Jan 25.

Abstract

The complex-valued gradient descent algorithm is a popular tool for optimizing functions of complex variables, especially for training complex-valued neural networks. However, choosing a suitable learning stepsize during training is a challenging task. In this paper, an adaptive complex-valued stepsize design method is proposed for complex-valued neural networks by generalizing the adaptable learning rate tree technique to the complex domain. Scaling and rotation factors are introduced to simultaneously adjust the amplitude and phase of the complex-valued stepsize. The search range is thus expanded from a half line to a half plane, so that a better search direction is obtained at each iteration. We analyze the dynamics of the algorithm near a saddle point and find that it escapes from the saddle point easily, guaranteeing fast convergence and high accuracy. Experimental results on function approximation and pattern classification tasks illustrate the advantages of the proposed algorithm over several previous ones.
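The core idea of the abstract can be sketched in code: at each iteration, candidate complex stepsizes are generated by scaling the amplitude and rotating the phase of the current stepsize, and the candidate that most reduces the loss is kept. The sketch below is a minimal illustration of this scale-and-rotate search on a toy complex quadratic, not the paper's exact algorithm; the function names, the candidate grids `scales` and `rotations`, and the greedy selection rule are all assumptions.

```python
import numpy as np

def adapt_complex_stepsize(loss, wirtinger_grad, w, eta=0.1 + 0.0j,
                           scales=(0.5, 1.0, 2.0),
                           rotations=(-np.pi / 8, 0.0, np.pi / 8),
                           iters=50):
    """Greedy complex-stepsize adaptation (illustrative sketch).

    At each iteration, candidate stepsizes s * exp(i*theta) * eta vary both the
    amplitude (via s) and the phase (via theta) of the current stepsize, so the
    search covers a region of the complex plane rather than a half line.
    """
    for _ in range(iters):
        g = wirtinger_grad(w)
        best_w, best_eta, best_loss = w, eta, np.inf
        for s in scales:
            for theta in rotations:
                cand_eta = s * np.exp(1j * theta) * eta   # scale + rotate
                cand_w = w - cand_eta * g                 # complex GD step
                cand_loss = loss(cand_w)
                if cand_loss < best_loss:
                    best_w, best_eta, best_loss = cand_w, cand_eta, cand_loss
        w, eta = best_w, best_eta
    return w

# Toy problem: minimize f(w) = |w - c|^2 over complex w.
# The Wirtinger gradient dF/d(conj(w)) is simply (w - c).
c = 2.0 - 3.0j
w_star = adapt_complex_stepsize(lambda w: abs(w - c) ** 2,
                                lambda w: w - c,
                                w=0.0 + 0.0j)
```

On this toy problem the amplitude of the stepsize grows geometrically until it nears the optimal value, after which the iterates contract to the minimizer; the phase rotations matter more on problems where the gradient direction itself needs correcting.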

