
Learning Sparse Deep Neural Networks with a Spike-and-Slab Prior.

Authors

Sun Yan, Song Qifan, Liang Faming

Affiliations

Department of Statistics, Purdue University, West Lafayette, IN 47907, USA.

Publication

Stat Probab Lett. 2022 Jan;180. doi: 10.1016/j.spl.2021.109246. Epub 2021 Sep 24.

DOI: 10.1016/j.spl.2021.109246
PMID: 34744226
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8570537/
Abstract

Deep learning has achieved great successes in many machine learning tasks. However, the deep neural networks (DNNs) are often severely over-parameterized, making them computationally expensive, memory intensive, less interpretable and mis-calibrated. We study sparse DNNs under the Bayesian framework: we establish posterior consistency and structure selection consistency for Bayesian DNNs with a spike-and-slab prior, and illustrate their performance using examples on high-dimensional nonlinear variable selection, large network compression and model calibration. Our numerical results indicate that sparsity is essential for improving the prediction accuracy and calibration of the DNN.
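The spike-and-slab prior referenced in the abstract models each network weight as a mixture of a narrow near-zero "spike" (inactive weight) and a diffuse "slab" (active weight). A minimal sketch of a Gaussian spike-and-slab mixture is below; the mixture weight `lam` and the two variances are illustrative placeholders, not the paper's actual hyperparameter settings.

```python
import math


def spike_and_slab_logpdf(w, lam=0.1, sigma_spike=1e-2, sigma_slab=1.0):
    """Log-density of a Gaussian spike-and-slab mixture prior on one weight.

    With probability `lam` the weight is drawn from the wide "slab"
    N(0, sigma_slab^2); otherwise from the narrow "spike" N(0, sigma_spike^2).
    """
    def normal_logpdf(x, sigma):
        return -0.5 * (x / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

    # log-sum-exp over the two mixture components for numerical stability
    a = math.log(lam) + normal_logpdf(w, sigma_slab)
    b = math.log(1.0 - lam) + normal_logpdf(w, sigma_spike)
    m = max(a, b)
    return m + math.log(math.exp(a - m) + math.exp(b - m))


def posterior_inclusion(w, lam=0.1, sigma_spike=1e-2, sigma_slab=1.0):
    """Probability that a weight of magnitude w came from the slab ("active")."""
    def normal_pdf(x, sigma):
        return math.exp(-0.5 * (x / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    slab = lam * normal_pdf(w, sigma_slab)
    spike = (1.0 - lam) * normal_pdf(w, sigma_spike)
    return slab / (slab + spike)
```

Weights whose posterior inclusion probability is near zero can be pruned, which is one way a spike-and-slab posterior yields the sparsity and structure selection the abstract describes.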


Similar Articles

1. Learning Sparse Deep Neural Networks with a Spike-and-Slab Prior.
Stat Probab Lett. 2022 Jan;180. doi: 10.1016/j.spl.2021.109246. Epub 2021 Sep 24.
2. Consistent Sparse Deep Learning: Theory and Computation.
J Am Stat Assoc. 2022;117(540):1981-1995. doi: 10.1080/01621459.2021.1895175. Epub 2021 Apr 20.
3. Layer adaptive node selection in Bayesian neural networks: Statistical guarantees and implementation details.
Neural Netw. 2023 Oct;167:309-330. doi: 10.1016/j.neunet.2023.08.029. Epub 2023 Aug 22.
4. Transformed ℓ1 regularization for learning sparse deep neural networks.
Neural Netw. 2019 Nov;119:286-298. doi: 10.1016/j.neunet.2019.08.015. Epub 2019 Aug 27.
5. Symbolic Deep Networks: A Psychologically Inspired Lightweight and Efficient Approach to Deep Learning.
Top Cogn Sci. 2022 Oct;14(4):702-717. doi: 10.1111/tops.12571. Epub 2021 Oct 5.
6. SSGD: Sparsity-Promoting Stochastic Gradient Descent Algorithm for Unbiased DNN Pruning.
Proc IEEE Int Conf Acoust Speech Signal Process. 2020 May;2020:5410-5414. doi: 10.1109/icassp40776.2020.9054436. Epub 2020 May 14.
7. Perturbation of deep autoencoder weights for model compression and classification of tabular data.
Neural Netw. 2022 Dec;156:160-169. doi: 10.1016/j.neunet.2022.09.020. Epub 2022 Sep 27.
8. DropConnect is effective in modeling uncertainty of Bayesian deep networks.
Sci Rep. 2021 Mar 9;11(1):5458. doi: 10.1038/s41598-021-84854-x.
9. Synergistic Integration of Deep Neural Networks and Finite Element Method with Applications of Nonlinear Large Deformation Biomechanics.
Comput Methods Appl Mech Eng. 2023 Nov 1;416. doi: 10.1016/j.cma.2023.116347. Epub 2023 Aug 22.
10. Density regression and uncertainty quantification with Bayesian deep noise neural networks.
Stat. 2023 Jan-Dec;12(1). doi: 10.1002/sta4.604. Epub 2023 Aug 1.

Cited By

1. VBayesMM: variational Bayesian neural network to prioritize important relationships of high-dimensional microbiome multiomics data.
Brief Bioinform. 2025 Jul 2;26(4). doi: 10.1093/bib/bbaf300.
2. A New Paradigm for Generative Adversarial Networks based on Randomized Decision Rules.
Stat Sin. 2025 Apr;35(2):897-918. doi: 10.5705/ss.202022.0404.

References

1. Consistent Sparse Deep Learning: Theory and Computation.
J Am Stat Assoc. 2022;117(540):1981-1995. doi: 10.1080/01621459.2021.1895175. Epub 2021 Apr 20.
2. Extended Stochastic Gradient MCMC for Large-Scale Bayesian Variable Selection.
Biometrika. 2020 Dec;107(4):997-1004. doi: 10.1093/biomet/asaa029. Epub 2020 Jul 13.
3. Bayesian Neural Networks for Selection of Drug Sensitive Genes.
J Am Stat Assoc. 2018;113(523):955-972. doi: 10.1080/01621459.2017.1409122. Epub 2018 Jun 28.