Roh Seok-Beom, Oh Sung-Kwun, Pedrycz Witold, Fu Zunwei
IEEE Trans Neural Netw Learn Syst. 2022 Apr;33(4):1385-1399. doi: 10.1109/TNNLS.2020.3041947. Epub 2022 Apr 4.
This article considers two issues in dynamically generated hierarchical neural networks: the type of basic neuron and how each layer is composed. For the first issue, a variant of least-squares support vector regression (SVR) is chosen as the basic neuron. The support vector machine (SVM) is a representative classifier that usually exhibits good classification performance; SVR was introduced alongside SVMs to handle regression problems. In particular, least-squares SVR offers high learning speed because the inequality constraints of the optimization problem are replaced by equality constraints. Building on least-squares SVR, the multiple least-squares (MLS) SVR, a linear combination of least-squares SVRs formed with the aid of fuzzy clustering, is proposed to improve modeling performance. In addition, a hierarchical neural network is developed in which the MLS SVR serves as the generic node in place of the conventional polynomial. For hierarchical networks generated dynamically layer by layer, the key issue discussed is how to retain the diversity of the nodes within each layer as the number of layers grows. To maintain this diversity, selection methods such as truncation selection and roulette wheel selection (RWS) are proposed for choosing nodes among the candidates. Furthermore, to reduce the computational overhead of evaluating all candidates arising from every composition of the input variables, a new implementation method is proposed. From the standpoint of node diversity and computational cost, the proposed method is shown to be preferable to the conventional design methodology.
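The speed advantage of least-squares SVR mentioned above comes from the equality constraints turning the usual quadratic program into a single linear system. The sketch below illustrates this, assuming an RBF kernel; the function names (`ls_svr_fit`, `ls_svr_predict`) and hyperparameter choices are illustrative, not the paper's implementation.

```python
import numpy as np

def ls_svr_fit(X, y, gamma=10.0, sigma=1.0):
    """Fit a least-squares SVR with an RBF kernel.

    Because the constraints are equalities, training reduces to
    solving one (n+1) x (n+1) linear system instead of a QP:
        [0   1^T       ] [b]     [0]
        [1   K + I/gam ] [a]  =  [y]
    """
    n = X.shape[0]
    # RBF (Gaussian) kernel matrix over the training inputs
    sq = np.sum(X**2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma  # regularization term from 1/gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    b, alpha = sol[0], sol[1:]
    return alpha, b

def ls_svr_predict(X_train, alpha, b, sigma, X_new):
    """Evaluate the fitted model at new inputs."""
    d = np.sum((X_new[:, None, :] - X_train[None, :, :])**2, axis=2)
    return np.exp(-d / (2 * sigma**2)) @ alpha + b
```

A large `gamma` weakens the regularization and drives the fit toward interpolation of the training targets; this single-solve structure is what makes stacking many such neurons into a layered network computationally feasible.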
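The two selection schemes named in the abstract can be sketched as follows. This is a generic illustration of truncation selection versus roulette wheel selection, assuming a scalar fitness per candidate node; it is not the paper's exact procedure, and the function names are hypothetical.

```python
import random

def truncation_select(candidates, fitness, k):
    """Keep only the k best candidates (baseline; low diversity,
    since near-duplicates of the top node tend to survive)."""
    order = sorted(range(len(candidates)),
                   key=lambda i: fitness[i], reverse=True)
    return [candidates[i] for i in order[:k]]

def roulette_wheel_select(candidates, fitness, k, rng=random):
    """Draw k candidates with probability proportional to fitness,
    so weaker but structurally different nodes can still be kept."""
    total = sum(fitness)
    chosen = []
    for _ in range(k):
        r = rng.uniform(0.0, total)
        acc = 0.0
        for cand, f in zip(candidates, fitness):
            acc += f
            if acc >= r:
                chosen.append(cand)
                break
    return chosen
```

The trade-off the abstract points at is visible here: truncation is deterministic and greedy, while RWS is stochastic, which helps keep the nodes of a newly generated layer diverse as the network deepens.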