Wahab Abdul, Khan Shujaat
IEEE Trans Neural Netw Learn Syst. 2020 Mar;31(3):1066-1068. doi: 10.1109/TNNLS.2019.2899219. Epub 2019 Mar 12.
In this comment, we raise serious concerns over the derivation of the rate of convergence of the fractional steepest descent algorithm in the fractional adaptive learning approach presented in "Fractional Extreme Value Adaptive Training Method: Fractional Steepest Descent Approach." We substantiate that the estimate of the rate of convergence is overstated. We also draw attention to a critical flaw in the design of the algorithm that limits its applicability to broad adaptive learning problems. Our claims are based on analytical reasoning supported by experimental results.
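For context, a fractional steepest descent update replaces the ordinary gradient with a fractional-order derivative of the cost. The sketch below is a minimal illustration for the quadratic cost f(x) = x^2, using the Caputo fractional derivative (with base point 0), for which D^α x^2 = 2 x^(2−α) / Γ(3−α) when x > 0; the step size μ, order α, and this specific cost are illustrative assumptions, not the exact formulation of the commented paper.

```python
import math

def caputo_frac_grad_quadratic(x, alpha):
    # Caputo fractional derivative (base point 0) of f(x) = x**2:
    #   D^alpha x^2 = 2 * x**(2 - alpha) / Gamma(3 - alpha),  valid for x > 0
    return 2.0 * x ** (2.0 - alpha) / math.gamma(3.0 - alpha)

def fractional_steepest_descent(x0, alpha=0.9, mu=0.1, steps=200):
    # Fractional steepest descent iteration (illustrative sketch):
    #   x_{k+1} = x_k - mu * D^alpha f(x_k)
    x = x0
    for _ in range(steps):
        x -= mu * caputo_frac_grad_quadratic(x, alpha)
    return x

x_final = fractional_steepest_descent(2.0)
print(x_final)  # iterates decrease toward the minimizer x = 0
```

Note that because the fractional derivative of x^2 vanishes like x^(2−α) rather than x, the effective step shrinks near the minimizer, which is one reason convergence-rate claims for such schemes require careful analysis.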