Approximation in shift-invariant spaces with deep ReLU neural networks.

Affiliations

Department of Mathematics, The Hong Kong University of Science and Technology, Clear Water Bay, Kowloon, Hong Kong; Theory Lab, Huawei Technologies Co., Ltd., Shenzhen, China.

Theory Lab, Huawei Technologies Co., Ltd., Shenzhen, China.

Publication information

Neural Netw. 2022 Sep;153:269-281. doi: 10.1016/j.neunet.2022.06.013. Epub 2022 Jun 16.

Abstract

We study the expressive power of deep ReLU neural networks for approximating functions in dilated shift-invariant spaces, which are widely used in signal processing, image processing, communications, and related fields. Approximation error bounds are estimated with respect to the width and depth of neural networks. The network construction is based on the bit extraction and data-fitting capacity of deep neural networks. As applications of our main results, the approximation rates for classical function spaces such as Sobolev spaces and Besov spaces are obtained. We also give lower bounds of the L^p (1 ≤ p ≤ ∞) approximation error for Sobolev spaces, which show that our construction of neural networks is asymptotically optimal up to a logarithmic factor.
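For orientation, the display below is an editorial sketch of the standard objects named in the abstract; it is not taken from the paper, and the authors' exact normalization, index sets, and dilation convention may differ.

\[
\sigma(x) = \max\{x, 0\}, \qquad
V_j(\phi) = \Bigl\{\, \sum_{k \in \mathbb{Z}^d} c_k\, \phi(2^j x - k) \,\Bigr\}, \qquad
f_\theta(x) = A_L\, \sigma\bigl(A_{L-1}\, \sigma(\cdots \sigma(A_1 x + b_1)\cdots) + b_{L-1}\bigr) + b_L .
\]

Here \(\sigma\) is the ReLU activation applied entrywise, \(V_j(\phi)\) is a shift-invariant space generated by \(\phi\) and dilated by \(2^j\), and \(f_\theta\) is a ReLU network whose depth is the number of affine maps \(A_i x + b_i\) and whose width is the largest output dimension among the hidden layers. The L^p approximation error discussed in the abstract is then \(\inf_\theta \|f - f_\theta\|_{L^p}\) over networks of a given width and depth.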
