IEEE Trans Cybern. 2017 Oct;47(10):3466-3479. doi: 10.1109/TCYB.2017.2734043. Epub 2017 Aug 21.
This paper contributes to the development of randomized methods for neural networks. The proposed learner model, termed a stochastic configuration network (SCN), is generated incrementally by stochastic configuration (SC) algorithms. In contrast to existing randomized learning algorithms for single-layer feed-forward networks, we randomly assign the input weights and biases of the hidden nodes under a supervisory mechanism, and the output weights are analytically evaluated in either a constructive or a selective manner. As a foundation for SCN-based data modeling techniques, we establish theoretical results on the universal approximation property. Three versions of the SC algorithm are presented for data regression and classification problems. Simulation results on both regression and classification tasks indicate some remarkable merits of the proposed SCNs: little human intervention in setting the network size, scope adaptation of the random parameters, fast learning, and sound generalization.
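The incremental construction described above can be sketched in code. The following is a minimal, illustrative NumPy sketch, not the authors' exact SC-I/II/III implementation: candidate hidden nodes are drawn randomly within an adaptively widened scope, a candidate is accepted only if it satisfies a simplified supervisory inequality tying its activation to the current residual, and the output weights are re-evaluated analytically by global least squares after each accepted node. The function name `scn_fit`, the scope sequence, and the specific inequality form are assumptions for illustration.

```python
import numpy as np

def scn_fit(X, y, max_nodes=50, candidates=20,
            lam_scope=(0.5, 1.0, 5.0, 10.0),
            r_seq=(0.9, 0.99, 0.999), tol=1e-2, rng=None):
    """Illustrative sketch of stochastic-configuration learning
    (simplified; not the published SC-III algorithm verbatim)."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    H = np.empty((n, 0))   # hidden-layer output matrix
    e = y.copy()           # current residual
    for _ in range(max_nodes):
        if np.linalg.norm(e) < tol:
            break
        best = None
        # scope adaptation: widen the sampling range lam and relax r
        # until some candidate passes the supervisory inequality
        for lam in lam_scope:
            for r in r_seq:
                for _ in range(candidates):
                    w = rng.uniform(-lam, lam, d)
                    b = rng.uniform(-lam, lam)
                    g = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid node
                    # simplified supervisory condition:
                    # <e,g>^2 / <g,g> >= (1 - r) * ||e||^2
                    xi = (e @ g) ** 2 / (g @ g) - (1.0 - r) * (e @ e)
                    if xi > 0 and (best is None or xi > best[0]):
                        best = (xi, g)
                if best is not None:
                    break
            if best is not None:
                break
        if best is None:
            break  # no acceptable random node found in any scope
        H = np.column_stack([H, best[1]])
        # output weights evaluated analytically (global least squares)
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        e = y - H @ beta
    return H, e

# usage: approximate a 1-D target function
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0])
H, e = scn_fit(X, y, rng=rng)
```

Because the output weights are recomputed by least squares over all accepted nodes, the residual norm is non-increasing as nodes are added, which mirrors the constructive flavor of the universal approximation results the abstract refers to.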