IEEE Trans Neural Netw Learn Syst. 2014 May;25(5):970-9. doi: 10.1109/TNNLS.2013.2285242.
A recently proposed neural network architecture that incorporates time dependency explicitly for modeling nonlinear nonstationary dynamic systems is further developed in this paper, and three alternate configurations are proposed to represent the dynamics of batch chemical processes. The first configuration consists of L subnets, each having M inputs representing the past samples of the process inputs and output; each subnet has a hidden layer with a polynomial activation function; the outputs of the hidden layer are combined and acted upon by an explicitly time-dependent modulation function. The outputs of all the subnets are summed to obtain the output prediction. In the second configuration, additional weights are incorporated to obtain a more generalized model. In the third configuration, the subnets are eliminated by incorporating an additional hidden layer consisting of L nodes. A backpropagation learning algorithm is formulated for each of the proposed neural network configurations to determine the weights, the polynomial coefficients, and the modulation function parameters. The modeling capability of the proposed configurations is evaluated by employing them to represent the dynamics of a batch reactor in which a consecutive reaction takes place. The results show that all three time-varying neural network configurations represent the batch reactor dynamics accurately, and the third configuration exhibits comparable or better performance than the other two while requiring a much smaller number of parameters. The modeling ability of the third configuration is further validated by applying it to a semibatch polymerization reactor challenge problem. This paper illustrates that the proposed approach can be applied to represent the dynamics of any batch/semibatch process.
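The abstract does not give the exact functional forms, so the following is only a minimal sketch of how the first configuration could be realized: L subnets, each mapping M past input/output samples through a hidden layer with an (assumed quadratic) polynomial activation, with each subnet's combined output scaled by an explicitly time-dependent modulation function (assumed Gaussian in normalized batch time) before summation. All sizes, weight initializations, and functional forms below are illustrative assumptions, not the paper's specification.

```python
import numpy as np

# Minimal sketch of the first time-varying configuration (assumptions:
# quadratic polynomial activation, Gaussian time modulation).
L, M, H = 3, 4, 5          # subnets, past samples per subnet, hidden nodes (illustrative)
rng = np.random.default_rng(0)

W = rng.normal(size=(L, H, M))      # input-to-hidden weights, one set per subnet
poly = rng.normal(size=(L, H, 3))   # per-node polynomial coefficients a0 + a1*z + a2*z^2
v = rng.normal(size=(L, H))         # hidden-to-output combining weights per subnet
centers = np.linspace(0.0, 1.0, L)  # modulation-function centers over normalized batch time
width = 0.2                         # modulation-function width (assumed)

def predict(x_past, t):
    """One-step-ahead prediction from M past input/output samples at
    normalized batch time t in [0, 1]."""
    y = 0.0
    for l in range(L):
        z = W[l] @ x_past                                              # hidden pre-activations
        h = poly[l, :, 0] + poly[l, :, 1] * z + poly[l, :, 2] * z**2   # polynomial activation
        s = v[l] @ h                                                   # combined hidden output of subnet l
        m = np.exp(-((t - centers[l]) / width) ** 2)                   # explicit time-dependent modulation
        y += m * s                                                     # modulated subnet outputs summed
    return y

print(predict(rng.normal(size=M), t=0.3))
```

In this sketch, the weights W, v, the polynomial coefficients, and the modulation parameters (centers, width) would all be adjusted by the backpropagation learning described in the paper; the training loop itself is omitted.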