Smooth Function Approximation by Deep Neural Networks with General Activation Functions

Authors

Ohn Ilsang, Kim Yongdai

Affiliations

Department of Statistics, Seoul National University, Seoul 08826, Korea.

Publication Information

Entropy (Basel). 2019 Jun 26;21(7):627. doi: 10.3390/e21070627.

DOI: 10.3390/e21070627
PMID: 33267341
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7515121/
Abstract

There has been growing interest in the expressivity of deep neural networks. However, most existing work on this topic focuses only on specific activation functions such as ReLU or sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class of activation functions, one that includes most of the activation functions in common use. We derive the depth, width, and sparsity that a deep neural network requires to approximate any Hölder smooth function up to a given approximation error, for this large class of activation functions. Based on our approximation error analysis, we establish the minimax optimality of deep neural network estimators with general activation functions in both regression and classification problems.
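
For context, "Hölder smooth" above refers to the standard Hölder class; the abstract does not restate the definition, so the notation below is an assumption for readers. Writing the smoothness as β = m + s with integer m ≥ 0 and s ∈ (0, 1], a function f is Hölder smooth with index β when the following norm is finite:

```latex
% Standard Hölder norm (assumed notation; the paper's constants may differ).
\|f\|_{H^{\beta}}
  = \max_{|\alpha| \le m} \sup_{x} \bigl|\partial^{\alpha} f(x)\bigr|
  + \max_{|\alpha| = m} \sup_{x \ne y}
      \frac{\bigl|\partial^{\alpha} f(x) - \partial^{\alpha} f(y)\bigr|}{\lVert x - y \rVert^{s}}
```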

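As a quick empirical companion to the abstract (the paper's result is a constructive approximation bound, not a training procedure), the sketch below fits a small fully connected network with a swappable activation to a smooth one-dimensional target and reports the sup-norm error on a fine grid. All choices here (target function, width, depth, activation, optimizer, step count) are illustrative assumptions, not the paper's construction.

```python
# Minimal sketch: how well does a small network with a general (non-ReLU)
# activation approximate a smooth function in the sup norm? Illustrative only.
import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(0)

def target(x):
    # A smooth target on [0, 1]; smooth implies Hölder smooth for any beta.
    return torch.sin(2 * np.pi * x) * torch.exp(-x)

def make_net(width=64, depth=3, activation=nn.Softplus):
    # Swap in nn.Sigmoid, nn.ELU, nn.ReLU, ... to mimic the paper's theme of
    # covering a broad class of activation functions.
    layers = [nn.Linear(1, width), activation()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), activation()]
    layers.append(nn.Linear(width, 1))
    return nn.Sequential(*layers)

net = make_net()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x = torch.rand(2048, 1)  # training points drawn from [0, 1)
y = target(x)

for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(x), y)
    loss.backward()
    opt.step()

# Sup-norm error on a fine grid, the error notion used in approximation bounds.
grid = torch.linspace(0, 1, 10_000).unsqueeze(1)
with torch.no_grad():
    sup_err = (net(grid) - target(grid)).abs().max().item()
print(f"sup-norm error on [0, 1]: {sup_err:.4f}")
```

Increasing the width or depth should let the error be driven down, consistent with the depth/width/sparsity trade-off the abstract describes, though a trained network is only a heuristic stand-in for the paper's constructed one.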

Similar Articles

1. Smooth Function Approximation by Deep Neural Networks with General Activation Functions.
Entropy (Basel). 2019 Jun 26;21(7):627. doi: 10.3390/e21070627.
2. Simultaneous approximation of a smooth function and its derivatives by deep neural networks with piecewise-polynomial activations.
Neural Netw. 2023 Apr;161:242-253. doi: 10.1016/j.neunet.2023.01.035. Epub 2023 Feb 2.
3. Approximation of smooth functionals using deep ReLU networks.
Neural Netw. 2023 Sep;166:424-436. doi: 10.1016/j.neunet.2023.07.012. Epub 2023 Jul 18.
4. Optimal approximation of piecewise smooth functions using deep ReLU neural networks.
Neural Netw. 2018 Dec;108:296-330. doi: 10.1016/j.neunet.2018.08.019. Epub 2018 Sep 7.
5. Low dimensional approximation and generalization of multivariate functions on smooth manifolds using deep ReLU neural networks.
Neural Netw. 2024 Jun;174:106223. doi: 10.1016/j.neunet.2024.106223. Epub 2024 Mar 1.
6. Neural networks with ReLU powers need less depth.
Neural Netw. 2024 Apr;172:106073. doi: 10.1016/j.neunet.2023.12.027. Epub 2023 Dec 19.
7. Simultaneous neural network approximation for smooth functions.
Neural Netw. 2022 Oct;154:152-164. doi: 10.1016/j.neunet.2022.06.040. Epub 2022 Jul 9.
8. Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth.
Neural Comput. 2021 Mar 26;33(4):1005-1036. doi: 10.1162/neco_a_01364.
9. Deep ReLU neural networks in high-dimensional approximation.
Neural Netw. 2021 Oct;142:619-635. doi: 10.1016/j.neunet.2021.07.027. Epub 2021 Jul 29.
10. Approximate Policy Iteration With Deep Minimax Average Bellman Error Minimization.
IEEE Trans Neural Netw Learn Syst. 2025 Feb;36(2):2288-2299. doi: 10.1109/TNNLS.2023.3346992. Epub 2025 Feb 6.

Cited By

1. A new method for Tomicus classification of forest pests based on improved ResNet50 algorithm.
Sci Rep. 2025 Mar 20;15(1):9665. doi: 10.1038/s41598-025-93407-5.
2. A maize seed variety identification method based on improving deep residual convolutional network.
Front Plant Sci. 2024 May 13;15:1382715. doi: 10.3389/fpls.2024.1382715. eCollection 2024.
3. Enhancing solids deposit prediction in gully pots with explainable hybrid models: A review.
Water Sci Technol. 2024 Apr;89(8):1891-1912. doi: 10.2166/wst.2024.077. Epub 2024 Mar 12.
4. High-Performance Statistical Computing in the Computing Environments of the 2020s.
Stat Sci. 2022 Nov;37(4):494-518. doi: 10.1214/21-sts835. Epub 2022 Oct 13.
5. Development of an IoT Architecture Based on a Deep Neural Network against Cyber Attacks for Automated Guided Vehicles.
Sensors (Basel). 2021 Dec 18;21(24):8467. doi: 10.3390/s21248467.
6. Diagnosing COVID-19 disease using an efficient CAD system.
Optik (Stuttg). 2021 Sep;241:167199. doi: 10.1016/j.ijleo.2021.167199. Epub 2021 May 18.

References

1. Fast convergence rates of deep neural networks for classification.
Neural Netw. 2021 Jun;138:179-197. doi: 10.1016/j.neunet.2021.02.012. Epub 2021 Feb 23.
2. A comparison of deep networks with ReLU activation function and linear spline-type methods.
Neural Netw. 2019 Feb;110:232-242. doi: 10.1016/j.neunet.2018.11.005. Epub 2018 Dec 4.
3. Optimal approximation of piecewise smooth functions using deep ReLU neural networks.
Neural Netw. 2018 Dec;108:296-330. doi: 10.1016/j.neunet.2018.08.019. Epub 2018 Sep 7.
4. Error bounds for approximations with deep ReLU networks.
Neural Netw. 2017 Oct;94:103-114. doi: 10.1016/j.neunet.2017.07.002. Epub 2017 Jul 13.
5. Deep learning.
Nature. 2015 May 28;521(7553):436-44. doi: 10.1038/nature14539.