Universal Approximation Abilities of a Modular Differentiable Neural Network.

Authors

Wang Jian, Wu Shujun, Zhang Huaqing, Yuan Bin, Dai Caili, Pal Nikhil R

Publication

IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):5586-5600. doi: 10.1109/TNNLS.2024.3378697. Epub 2025 Feb 28.

DOI: 10.1109/TNNLS.2024.3378697
PMID: 38568758
Abstract

Approximation ability is one of the most important topics in the field of neural networks (NNs). Feedforward NNs activated by rectified linear units, and some of their specific smoothed versions, are universal approximators for convex as well as continuous functions. However, most of these networks have been investigated empirically, or their characteristics analyzed under specific operation rules, and an adequate level of interpretability is also missing. In this work, we propose a new class of network architectures, built from reusable neural modules (functional blocks), that supply differentiable and interpretable approximators for convex and continuous target functions. Specifically, we first introduce a concrete model-construction mechanism with particular blocks, based on differentiable programming and the compositional essence of the max operator, which extends the scope of existing activation functions. Explicit block diagrams are provided for a clear understanding of both the external architecture and the internal processing mechanism. The approximation behavior of the proposed network on convex and continuous functions is then rigorously proved by mathematical induction. Finally, extensive numerical experiments on a wide variety of problems exhibit the effectiveness and superiority of the proposed model over several existing ones.
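The max-operator idea the abstract alludes to can be illustrated with a classic construction (a sketch only, not the paper's actual blocks): a convex function is the upper envelope of its tangent lines, so a max-affine model g(x) = max_i (a_i x + b_i) approximates it, and replacing max with the log-sum-exp soft maximum yields a smooth, differentiable version. All variable names below are illustrative.

```python
import numpy as np

# Convex target to approximate.
x = np.linspace(-1.0, 1.0, 201)
f = x ** 2

# Tangent lines of f at K evenly spaced tangency points t_i:
#   y = f'(t_i) * x + (f(t_i) - f'(t_i) * t_i)  ->  a_i = 2 t_i, b_i = -t_i^2
t = np.linspace(-1.0, 1.0, 8)
a = 2.0 * t            # slopes
b = -t ** 2            # intercepts

z = np.outer(x, a) + b          # (201, 8): every affine piece at every x

# Hard max-affine approximation: the upper envelope of the tangent lines.
g_max = z.max(axis=1)

# Smooth (differentiable) surrogate: tau * logsumexp(z / tau), which
# satisfies max(z) <= smooth_max(z) <= max(z) + tau * log(K).
tau = 0.01
m = z.max(axis=1, keepdims=True)                       # for numerical stability
g_lse = (m + tau * np.log(np.exp((z - m) / tau)
                          .sum(axis=1, keepdims=True))).ravel()

err_max = np.abs(g_max - f).max()   # worst-case error of the hard envelope
err_lse = np.abs(g_lse - f).max()   # worst-case error of the smooth version
```

With 8 tangent pieces on [-1, 1] the hard envelope is within about h^2/4 of f (h being the tangency spacing), and the log-sum-exp version adds at most tau * log(8) on top; shrinking tau recovers the hard max, which is the sense in which such smoothed compositions extend ReLU-style activations.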


Similar Articles

1. Universal Approximation Abilities of a Modular Differentiable Neural Network.
   IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):5586-5600. doi: 10.1109/TNNLS.2024.3378697. Epub 2025 Feb 28.
2. Parameterized Convex Universal Approximators for Decision-Making Problems.
   IEEE Trans Neural Netw Learn Syst. 2024 Feb;35(2):2448-2459. doi: 10.1109/TNNLS.2022.3190198. Epub 2024 Feb 5.
3. A Universal Approximation Result for Difference of Log-Sum-Exp Neural Networks.
   IEEE Trans Neural Netw Learn Syst. 2020 Dec;31(12):5603-5612. doi: 10.1109/TNNLS.2020.2975051. Epub 2020 Nov 30.
4. Differentiable self-supervised clustering with intrinsic interpretability.
   Neural Netw. 2024 Nov;179:106542. doi: 10.1016/j.neunet.2024.106542. Epub 2024 Jul 24.
5. Smoothing neural network for L regularized optimization problem with general convex constraints.
   Neural Netw. 2021 Nov;143:678-689. doi: 10.1016/j.neunet.2021.08.001. Epub 2021 Aug 8.
6. Extraction of rules from artificial neural networks for nonlinear regression.
   IEEE Trans Neural Netw. 2002;13(3):564-77. doi: 10.1109/TNN.2002.1000125.
7. Log-Sum-Exp Neural Networks and Posynomial Models for Convex and Log-Log-Convex Data.
   IEEE Trans Neural Netw Learn Syst. 2020 Mar;31(3):827-838. doi: 10.1109/TNNLS.2019.2910417. Epub 2019 May 15.
8. Fractional Approximation of Broad Learning System.
   IEEE Trans Cybern. 2024 Feb;54(2):811-824. doi: 10.1109/TCYB.2021.3127152. Epub 2024 Jan 17.
9. Universal approximation using incremental constructive feedforward networks with random hidden nodes.
   IEEE Trans Neural Netw. 2006 Jul;17(4):879-892. doi: 10.1109/TNN.2006.875977.
10. Momentum-Net: Fast and Convergent Iterative Neural Network for Inverse Problems.
    IEEE Trans Pattern Anal Mach Intell. 2023 Apr;45(4):4915-4931. doi: 10.1109/TPAMI.2020.3012955. Epub 2023 Mar 10.