Efficient Approximation of High-Dimensional Functions With Neural Networks.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2022 Jul;33(7):3079-3093. doi: 10.1109/TNNLS.2021.3049719. Epub 2022 Jul 6.

Abstract

In this article, we develop a framework for showing that neural networks can overcome the curse of dimensionality in different high-dimensional approximation problems. Our approach is based on the notion of a catalog network, which is a generalization of a standard neural network in which the nonlinear activation functions can vary from layer to layer as long as they are chosen from a predefined catalog of functions. As such, catalog networks constitute a rich family of continuous functions. We show that under appropriate conditions on the catalog, catalog networks can efficiently be approximated with rectified linear unit-type networks and provide precise estimates on the number of parameters needed for a given approximation accuracy. As special cases of the general results, we obtain different classes of functions that can be approximated with rectified linear unit networks without the curse of dimensionality.

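The catalog-network idea described in the abstract can be sketched in plain Python: a feed-forward pass in which each layer applies an affine map followed by an activation looked up in a predefined catalog, so different layers may use different nonlinearities. This is a minimal illustrative sketch; the names (`CATALOG`, `catalog_forward`) and the example weights are assumptions for demonstration, not taken from the paper.

```python
import math

# Predefined catalog of activation functions (illustrative choices).
CATALOG = {
    "relu": lambda x: max(x, 0.0),
    "tanh": math.tanh,
    "softplus": lambda x: math.log1p(math.exp(x)),
}

def catalog_forward(x, layers):
    """Forward pass of a catalog network.

    Each layer is a tuple (W, b, act): a weight matrix, a bias vector,
    and the name of an activation drawn from CATALOG. Unlike a standard
    network, the activation may differ from layer to layer.
    """
    for W, b, act in layers:
        f = CATALOG[act]
        x = [f(sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i)
             for row, b_i in zip(W, b)]
    return x

# Two-layer example mapping R^2 -> R^1: tanh in the first layer,
# relu in the second.
layers = [
    ([[0.5, -1.0], [1.0, 0.3]], [0.1, -0.2], "tanh"),
    ([[1.0, 1.0]], [0.0], "relu"),
]
out = catalog_forward([1.0, 2.0], layers)
print(out)
```

Restricting every layer to the same activation (e.g. `"relu"`) recovers a standard rectified linear unit network, which is why the paper's approximation results for catalog networks specialize to ReLU networks.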
