IEEE Trans Neural Netw Learn Syst. 2022 Jul;33(7):3079-3093. doi: 10.1109/TNNLS.2021.3049719. Epub 2022 Jul 6.
In this article, we develop a framework for showing that neural networks can overcome the curse of dimensionality in different high-dimensional approximation problems. Our approach is based on the notion of a catalog network, which is a generalization of a standard neural network in which the nonlinear activation functions can vary from layer to layer as long as they are chosen from a predefined catalog of functions. As such, catalog networks constitute a rich family of continuous functions. We show that under appropriate conditions on the catalog, catalog networks can efficiently be approximated with rectified linear unit-type networks and provide precise estimates on the number of parameters needed for a given approximation accuracy. As special cases of the general results, we obtain different classes of functions that can be approximated with rectified linear unit networks without the curse of dimensionality.
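To make the notion concrete, the following is a minimal Python sketch of a catalog network as described above: a feed-forward network whose activation can differ from layer to layer, each drawn from a fixed catalog. The particular catalog contents, function names, and dimensions are illustrative assumptions, not taken from the paper.

    import numpy as np

    # Illustrative catalog of admissible activations (assumed, not from the paper).
    CATALOG = {
        "relu": lambda x: np.maximum(x, 0.0),
        "tanh": np.tanh,
        # Numerically stable softplus: log(1 + exp(x)).
        "softplus": lambda x: np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0),
    }

    def catalog_network_forward(x, weights, biases, activations):
        """Evaluate a catalog network on input x.

        weights, biases: per-layer affine parameters.
        activations: per-hidden-layer keys into CATALOG; the output layer is affine.
        """
        h = x
        for W, b, name in zip(weights[:-1], biases[:-1], activations):
            h = CATALOG[name](W @ h + b)  # activation may change layer to layer
        return weights[-1] @ h + biases[-1]

    # Example: a 3-layer catalog network on a 4-dimensional input.
    rng = np.random.default_rng(0)
    dims = [4, 8, 8, 1]
    weights = [rng.standard_normal((dims[i + 1], dims[i])) for i in range(3)]
    biases = [rng.standard_normal(dims[i + 1]) for i in range(3)]
    y = catalog_network_forward(rng.standard_normal(4), weights, biases, ["relu", "tanh"])
    print(y)

A standard network is recovered by choosing the same catalog entry (e.g., "relu") for every layer; the paper's approximation results concern replacing an arbitrary admissible catalog choice with rectified linear unit-type layers at a quantifiable parameter cost.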