Learning Korobov Functions by Correntropy and Convolutional Neural Networks.

Authors

Fang Zhiying, Mao Tong, Fan Jun

Affiliations

Institute of Applied Mathematics, Shenzhen Polytechnic University, Shenzhen, Guangdong, China

Computer, Electrical and Mathematical Sciences and Engineering Division, King Abdullah University of Science and Technology, Thuwal 4700, Kingdom of Saudi Arabia

Publication

Neural Comput. 2024 Mar 21;36(4):718-743. doi: 10.1162/neco_a_01650.

DOI: 10.1162/neco_a_01650
PMID: 38457767
Abstract

Combining information-theoretic learning with deep learning has gained significant attention in recent years, as it offers a promising approach to tackle the challenges posed by big data. However, the theoretical understanding of convolutional structures, which are vital to many structured deep learning models, remains incomplete. To partially bridge this gap, this letter aims to develop generalization analysis for deep convolutional neural network (CNN) algorithms using learning theory. Specifically, we focus on investigating robust regression using correntropy-induced loss functions derived from information-theoretic learning. Our analysis demonstrates an explicit convergence rate for deep CNN-based robust regression algorithms when the target function resides in the Korobov space. This study sheds light on the theoretical underpinnings of CNNs and provides a framework for understanding their performance and limitations.
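For context, the correntropy-induced loss commonly used in maximum-correntropy robust regression is a bounded, Welsch-type function of the residual, ℓσ(r) = σ²(1 − exp(−r²/σ²)), which saturates for large residuals instead of growing quadratically. A minimal sketch of this robustness property (the scale parameter `sigma` and the example residuals are illustrative assumptions, not values from the paper):

```python
import math

def correntropy_loss(residual, sigma=1.0):
    """Correntropy-induced (Welsch-type) loss.

    Bounded above by sigma**2, so a single outlier residual can
    contribute at most sigma**2 to the empirical risk.
    """
    return sigma**2 * (1.0 - math.exp(-residual**2 / sigma**2))

# Compare with the squared loss on residuals containing one outlier.
residuals = [0.1, -0.2, 0.05, 10.0]  # last entry is an outlier
squared = [r**2 for r in residuals]
robust = [correntropy_loss(r) for r in residuals]

# The squared loss is dominated by the outlier (contribution 100.0),
# while the correntropy loss saturates near sigma**2 = 1.0.
print(squared)
print(robust)
```

This boundedness is what makes correntropy-based regression robust to heavy-tailed noise, and it is the loss family whose deep-CNN generalization behavior the paper analyzes over Korobov spaces.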


Similar Articles

1. Learning Korobov Functions by Correntropy and Convolutional Neural Networks.
Neural Comput. 2024 Mar 21;36(4):718-743. doi: 10.1162/neco_a_01650.

2. Generalization analysis of deep CNNs under maximum correntropy criterion.
Neural Netw. 2024 Jun;174:106226. doi: 10.1016/j.neunet.2024.106226. Epub 2024 Mar 5.

3. Generalization Analysis of CNNs for Classification on Spheres.
IEEE Trans Neural Netw Learn Syst. 2023 Sep;34(9):6200-6213. doi: 10.1109/TNNLS.2021.3134675. Epub 2023 Sep 1.

4. New Insights Into Learning With Correntropy-Based Regression.
Neural Comput. 2021 Jan;33(1):157-173. doi: 10.1162/neco_a_01334. Epub 2020 Oct 20.

5. Theory of deep convolutional neural networks III: Approximating radial functions.
Neural Netw. 2021 Dec;144:778-790. doi: 10.1016/j.neunet.2021.09.027. Epub 2021 Oct 6.

6. CNN-Siam: multimodal siamese CNN-based deep learning approach for drug‒drug interaction prediction.
BMC Bioinformatics. 2023 Mar 23;24(1):110. doi: 10.1186/s12859-023-05242-y.

7. The deep arbitrary polynomial chaos neural network or how Deep Artificial Neural Networks could benefit from data-driven homogeneous chaos theory.
Neural Netw. 2023 Sep;166:85-104. doi: 10.1016/j.neunet.2023.06.036. Epub 2023 Jul 10.

8. Theory of deep convolutional neural networks: Downsampling.
Neural Netw. 2020 Apr;124:319-327. doi: 10.1016/j.neunet.2020.01.018. Epub 2020 Jan 25.

9. A multimodal convolutional neuro-fuzzy network for emotion understanding of movie clips.
Neural Netw. 2019 Oct;118:208-219. doi: 10.1016/j.neunet.2019.06.010. Epub 2019 Jul 2.

10. Evaluating the Learning Procedure of CNNs through a Sequence of Prognostic Tests Utilising Information Theoretical Measures.
Entropy (Basel). 2021 Dec 30;24(1):67. doi: 10.3390/e24010067.