Zjavka Ladislav, Pedrycz Witold
VŠB – Technical University of Ostrava, Faculty of Electrical Engineering and Computer Science, Department of Computer Science, 17. listopadu 15/2172, Ostrava, Czech Republic.
Department of Electrical & Computer Engineering, University of Alberta, Edmonton T6R 2V4 AB, Canada; Department of Electrical and Computer Engineering, Faculty of Engineering, King Abdulaziz University, Jeddah, 21589, Saudi Arabia; Systems Research Institute, Polish Academy of Sciences Warsaw, Poland.
Neural Netw. 2016 Jan;73:58-69. doi: 10.1016/j.neunet.2015.10.001. Epub 2015 Oct 20.
Sum fraction terms can approximate multi-variable functions on the basis of discrete observations, replacing a partial differential equation definition with descriptions of elementary polynomial data relations. Artificial neural networks commonly transform the weighted sum of inputs to describe the overall similarity between trained and new testing input patterns. Differential polynomial neural networks form a new class of neural networks that construct and solve an unknown general partial differential equation of a function of interest, with selected substitution relative terms formed from non-linear multi-variable composite polynomials. The network layers generate simple and composite relative substitution terms whose convergent series combinations can describe partial dependent derivative changes of the input variables. This regression is based on trained generalized partial-derivative data relations, decomposed into a multi-layer polynomial network structure. The sigmoidal function, commonly used as a non-linear activation of artificial neurons, may transform some polynomial items together with their parameters in order to improve the ability of the polynomial derivative-term series to approximate complicated periodic functions, since simple low-order polynomials cannot fully reproduce complete cycles. The similarity analysis facilitates substitutions in differential equations, or can form dimensional units from data samples to describe real-world problems.
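As a rough illustration of the idea of summing fraction terms, the sketch below evaluates a sum of low-order composite polynomial numerators, passed through a sigmoid and divided by reduced-order polynomial denominators. The specific term forms, parameter ranges, and function names here are assumptions made for illustration only; they are not the exact term structure defined in the paper.

```python
import math
import random

def sigmoid(z):
    """Sigmoidal transform applied to a polynomial item, as the abstract
    suggests for approximating complicated periodic functions."""
    return 1.0 / (1.0 + math.exp(-z))

def fraction_term(x1, x2, a, b):
    """One hypothetical relative substitution term for two input variables:
    a low-order composite polynomial divided by a reduced-order polynomial
    (a derivative-like fraction). The exact form is an assumption."""
    num = a[0] + a[1]*x1 + a[2]*x2 + a[3]*x1*x2 + a[4]*x1*x1 + a[5]*x2*x2
    den = b[0] + b[1]*x1 + b[2]*x2
    return sigmoid(num) / den

def dpnn_output(x1, x2, terms):
    """A convergent series combination: the sum of selected fraction terms
    approximates the unknown multi-variable function."""
    return sum(fraction_term(x1, x2, a, b) for a, b in terms)

# Random placeholder parameters; in the actual method these would be
# selected and trained layer by layer in the polynomial network.
rng = random.Random(0)
terms = [([rng.uniform(-1.0, 1.0) for _ in range(6)],
          [rng.uniform(1.0, 2.0) for _ in range(3)]) for _ in range(3)]
print(dpnn_output(0.5, -0.3, terms))
```

The denominator coefficients are drawn from [1, 2] only so the sketch avoids division by zero near the sample point; a real implementation would handle term selection and singularities explicitly.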