Imperial College, Department of Mathematics, London, United Kingdom.
Universität Sankt Gallen, Faculty of Mathematics and Statistics, Sankt Gallen, Switzerland; University of Warwick, Department of Statistics, United Kingdom.
Neural Netw. 2024 Nov;179:106486. doi: 10.1016/j.neunet.2024.106486. Epub 2024 Jun 22.
Reservoir computing approximation and generalization bounds are proved for a new concept class of input/output systems that extends the so-called generalized Barron functionals to a dynamic context. This new class is characterized by readouts that admit a certain integral representation built on infinite-dimensional state-space systems. It is shown that this class is very rich and possesses useful features and universal approximation properties. The reservoir architectures used for the approximation and estimation of elements in the new class are randomly generated echo state networks with either linear or ReLU activation functions. Their readouts are built using randomly generated neural networks in which only the output layer is trained (extreme learning machines or random feature neural networks). The results in the paper yield a recurrent neural network-based learning algorithm with provable convergence guarantees that does not suffer from the curse of dimensionality when learning input/output systems in the class of generalized Barron functionals and measuring the error in a mean-squared sense.
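The architecture described above (a randomly generated echo state network driven through a ReLU state map, with only a linear output layer trained) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's construction: the dimensions, the spectral-norm scaling used to enforce a contractive (echo-state) reservoir, the synthetic target, and the ridge-regression readout are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not taken from the paper).
d_in, d_res, T = 2, 100, 500

# Randomly generated, untrained reservoir weights; the recurrent
# matrix is rescaled so the ReLU state map is contractive, a
# standard sufficient condition for the echo state property.
A = rng.normal(size=(d_res, d_res))
A *= 0.9 / np.linalg.norm(A, 2)
C = rng.normal(size=(d_res, d_in))
b = rng.normal(size=d_res)

def run_reservoir(inputs):
    """Drive the ReLU echo state network and collect the state sequence."""
    x = np.zeros(d_res)
    states = []
    for z in inputs:
        x = np.maximum(A @ x + C @ z + b, 0.0)  # ReLU activation
        states.append(x)
    return np.array(states)

# Synthetic input/output data standing in for the target functional.
inputs = rng.normal(size=(T, d_in))
targets = 0.1 * np.cumsum(inputs[:, 0])  # toy dynamic target

X = run_reservoir(inputs)

# Only the linear readout is trained: ridge regression on reservoir states.
lam = 1e-3
W = np.linalg.solve(X.T @ X + lam * np.eye(d_res), X.T @ targets)
mse = np.mean((X @ W - targets) ** 2)
```

Only `W` is fitted; `A`, `C`, and `b` stay at their random initialization, which is what makes training reduce to a linear least-squares problem.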