


Approximation of smooth functionals using deep ReLU networks.

Affiliations

School of Mathematical Science, Beihang University, Beijing, China; School of Data Science, City University of Hong Kong, Kowloon, Hong Kong.

Laboratory for AI-Powered Financial Technologies, Hong Kong Science Park, Shatin, New Territories, Hong Kong.

Publication Information

Neural Netw. 2023 Sep;166:424-436. doi: 10.1016/j.neunet.2023.07.012. Epub 2023 Jul 18.

DOI: 10.1016/j.neunet.2023.07.012
PMID: 37549610
Abstract

In recent years, deep neural networks have been employed to approximate nonlinear continuous functionals F defined on L^p([-1,1]) for 1≤p≤∞. However, the existing theoretical analysis in the literature is either unsatisfactory due to poor approximation results, or does not apply to the rectified linear unit (ReLU) activation function. This paper aims to investigate the approximation power of functional deep ReLU networks in two settings: F is continuous with restrictions on its modulus of continuity, and F has higher-order Fréchet derivatives. A novel functional network structure is proposed to extract features of the higher-order smoothness harbored by the target functional F. Quantitative rates of approximation in terms of the depth, width and total number of weights of the neural network are derived for both settings. We give logarithmic rates when measuring the approximation error on the unit ball of a Hölder space. In addition, we establish nearly polynomial rates (i.e., rates of the form exp(-a(log M)^b) with a>0 and 0<b<1) when measuring the approximation error on a space of analytic functions.
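The basic setup the abstract describes, a network that takes a function on [-1,1] as input and outputs a scalar, can be sketched in plain NumPy. This is a minimal illustration under assumed choices (grid size m, width, depth, and random untrained weights are all arbitrary), not the paper's actual construction: the input function is discretized at m sample points and the sample vector is fed through a small ReLU multilayer perceptron.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class FunctionalReLUNet:
    """Toy functional network: discretize f on [-1,1] at m points,
    then apply a small ReLU MLP to the sample vector.
    Sizes and weights are illustrative, not the paper's construction."""

    def __init__(self, m=64, width=32, depth=3, seed=0):
        rng = np.random.default_rng(seed)
        self.grid = np.linspace(-1.0, 1.0, m)
        dims = [m] + [width] * depth + [1]
        # He-style scaling keeps activations at a reasonable magnitude
        self.weights = [rng.standard_normal((a, b)) / np.sqrt(a)
                        for a, b in zip(dims[:-1], dims[1:])]
        self.biases = [np.zeros(b) for b in dims[1:]]

    def __call__(self, f):
        h = f(self.grid)                # discretization: f -> sample vector in R^m
        for W, b in zip(self.weights[:-1], self.biases[:-1]):
            h = relu(h @ W + b)         # hidden ReLU layers
        return (h @ self.weights[-1] + self.biases[-1]).item()

net = FunctionalReLUNet()
y = net(np.cos)  # evaluate the surrogate functional at f(x) = cos(x)
```

The discretization step is the crux: it reduces the infinite-dimensional input f to a finite vector, and the paper's rates quantify how the error depends on the network's depth, width and weight count once such a reduction is made.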


Similar Articles

1. Approximation of smooth functionals using deep ReLU networks.
Neural Netw. 2023 Sep;166:424-436. doi: 10.1016/j.neunet.2023.07.012. Epub 2023 Jul 18.
2. Deep ReLU neural networks in high-dimensional approximation.
Neural Netw. 2021 Oct;142:619-635. doi: 10.1016/j.neunet.2021.07.027. Epub 2021 Jul 29.
3. Simultaneous approximation of a smooth function and its derivatives by deep neural networks with piecewise-polynomial activations.
Neural Netw. 2023 Apr;161:242-253. doi: 10.1016/j.neunet.2023.01.035. Epub 2023 Feb 2.
4. Neural networks with ReLU powers need less depth.
Neural Netw. 2024 Apr;172:106073. doi: 10.1016/j.neunet.2023.12.027. Epub 2023 Dec 19.
5. Optimal approximation of piecewise smooth functions using deep ReLU neural networks.
Neural Netw. 2018 Dec;108:296-330. doi: 10.1016/j.neunet.2018.08.019. Epub 2018 Sep 7.
6. Nonlinear approximation via compositions.
Neural Netw. 2019 Nov;119:74-84. doi: 10.1016/j.neunet.2019.07.011. Epub 2019 Aug 2.
7. Simultaneous neural network approximation for smooth functions.
Neural Netw. 2022 Oct;154:152-164. doi: 10.1016/j.neunet.2022.06.040. Epub 2022 Jul 9.
8. Approximation in shift-invariant spaces with deep ReLU neural networks.
Neural Netw. 2022 Sep;153:269-281. doi: 10.1016/j.neunet.2022.06.013. Epub 2022 Jun 16.
9. Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth.
Neural Comput. 2021 Mar 26;33(4):1005-1036. doi: 10.1162/neco_a_01364.
10. Smooth Function Approximation by Deep Neural Networks with General Activation Functions.
Entropy (Basel). 2019 Jun 26;21(7):627. doi: 10.3390/e21070627.