School of Mathematics, Hangzhou Normal University, Hangzhou, Zhejiang 310036, China.
Department of Mathematics and Statistics, St. Francis Xavier University, Antigonish, NS B2G 2W5, Canada.
Neural Netw. 2022 Sep;153:179-191. doi: 10.1016/j.neunet.2022.06.007. Epub 2022 Jun 10.
In this paper, we introduce a new type of interpolation operator built from Lagrange polynomials of degree r, which can be regarded as a feedforward neural network with four layers. The approximation rate of the new operators can be estimated by the (r+1)-th modulus of smoothness of the target function. Under some smoothness assumptions on the activation function, we establish two key inequalities for the derivatives of the operators. With these two inequalities, using the K-functional and the Berens-Lorentz lemma from approximation theory, we establish the converse theorem of approximation. We also give a Voronovskaja-type asymptotic estimate of the operators for smooth functions. Furthermore, we extend the operators to the multivariate case and investigate their approximation properties for multivariate functions. Finally, numerical examples are given to demonstrate the validity of the theoretical results and the advantages of the operators.
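As background for the abstract above, the following is a minimal sketch of classical Lagrange interpolation of degree r on r+1 nodes, the polynomial building block the operators are based on. This is not the paper's operator (which further combines such polynomials with an activation function into a four-layer network); the function and variable names here are illustrative only.

```python
def lagrange_interpolant(nodes, values):
    """Return p(x), the unique polynomial of degree <= r satisfying
    p(nodes[i]) = values[i] for the r+1 distinct interpolation nodes."""
    def p(x):
        total = 0.0
        for i, xi in enumerate(nodes):
            # Lagrange basis polynomial l_i(x): equals 1 at nodes[i], 0 at the others
            basis = 1.0
            for j, xj in enumerate(nodes):
                if j != i:
                    basis *= (x - xj) / (xi - xj)
            total += values[i] * basis
        return total
    return p

# Example: degree-2 interpolation of f(x) = x**3 on the nodes {0, 0.5, 1}.
f = lambda x: x ** 3
nodes = [0.0, 0.5, 1.0]
p = lagrange_interpolant(nodes, [f(t) for t in nodes])
# p agrees with f exactly at each node; between nodes the error is governed
# by the smoothness of f, in line with modulus-of-smoothness estimates.
```

The interpolation error away from the nodes shrinks as f gets smoother, which is the kind of behavior the (r+1)-th modulus of smoothness quantifies in the approximation-rate results.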