Key Lab of Industrial Computer Control Engineering of Hebei Province, Yanshan University, Qinhuangdao 066004, China; National Engineering Research Center for Equipment and Technology of Cold Strip Rolling, Qinhuangdao 066004, China.
Neural Netw. 2014 Mar;51:57-66. doi: 10.1016/j.neunet.2013.12.006. Epub 2013 Dec 16.
This paper presents a novel artificial neural network with a very fast learning speed, in which all weights and biases are determined by applying the least squares method twice; it is therefore called the Least Square Fast Learning Network (LSFLN). A further difference from conventional neural networks is that the output neurons of the LSFLN receive not only the information from the hidden-layer neurons but also the external information passed directly from the input neurons. To test the validity of the LSFLN, it is applied to six classical regression problems and is also employed to build the functional relation between the combustion efficiency and the operating parameters of a 300 MW coal-fired boiler. Experimental results show that, compared with other methods, the LSFLN achieves much better regression precision and generalization ability with far fewer hidden neurons and at a much faster learning speed.
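The sketch below is only meant to illustrate the connection structure described in the abstract: the output layer is fed by both the hidden-layer activations and the raw inputs, and the output-side weights are obtained by a single least-squares solve. The random input-side weights, the tanh activation, and all function names here are assumptions made for illustration; the paper's twice least-squares procedure for determining every weight and bias is not reproduced.

```python
import numpy as np

def fit_sketch(X, Y, n_hidden=10, seed=0):
    """Illustrative fit of an LSFLN-like structure (not the paper's algorithm).

    Assumption: input-to-hidden weights and biases are drawn at random here;
    only the output-side weights are solved by least squares.
    """
    rng = np.random.default_rng(seed)
    n_samples, n_inputs = X.shape
    W_in = rng.standard_normal((n_inputs, n_hidden))  # input -> hidden weights (assumed random)
    b = rng.standard_normal(n_hidden)                 # hidden biases (assumed random)
    H = np.tanh(X @ W_in + b)                         # hidden-layer activations
    # Output layer sees both the hidden activations and the raw inputs,
    # mirroring the direct input-to-output links described in the abstract.
    G = np.hstack([H, X, np.ones((n_samples, 1))])
    W_out, *_ = np.linalg.lstsq(G, Y, rcond=None)     # one least-squares solve
    return W_in, b, W_out

def predict_sketch(X, W_in, b, W_out):
    H = np.tanh(X @ W_in + b)
    G = np.hstack([H, X, np.ones((X.shape[0], 1))])
    return G @ W_out

# Toy usage: regress y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
Y = np.sin(X)
params = fit_sketch(X, Y)
print(np.mean((predict_sketch(X, *params) - Y) ** 2))  # small training MSE
```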