

An Efficient Sparse Bayesian Learning Algorithm Based on Gaussian-Scale Mixtures.

Author Information

Zhou Wei, Zhang Hai-Tao, Wang Jun

Publication Information

IEEE Trans Neural Netw Learn Syst. 2022 Jul;33(7):3065-3078. doi: 10.1109/TNNLS.2020.3049056. Epub 2022 Jul 6.

Abstract

Sparse Bayesian learning (SBL) is a popular machine learning approach with a superior generalization capability due to the sparsity of its adopted model. However, it entails a matrix inversion at each iteration, hindering its practical applications with large-scale data sets. To overcome this bottleneck, we propose an efficient SBL algorithm with O(n) computational complexity per iteration based on a Gaussian-scale mixture prior model. By specifying two different hyperpriors, the proposed efficient SBL algorithm can meet two different requirements, namely high efficiency and high sparsity. A surrogate function is introduced herein to approximate the posterior density of model parameters and thereby to avoid matrix inversions. Using a data-dependent term, a joint cost function with separate penalty terms is reformulated in a joint space of model parameters and hyperparameters. The resulting nonconvex optimization problem is solved using a block coordinate descent method in a majorization-minimization framework. Finally, the results of extensive experiments for sparse signal recovery and sparse image reconstruction on benchmark problems are elaborated to substantiate the effectiveness and superiority of the proposed approach in terms of computational time and estimation error.
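To illustrate the kind of inversion-free iteration the abstract describes, the following is a minimal sketch, not the authors' actual algorithm: a Laplace-type Gaussian-scale-mixture prior yields a reweighted l1 penalty, and each majorization-minimization step uses a separable quadratic surrogate of the data term, so an iteration costs only matrix-vector products (O(mn)) instead of the O(n^3) matrix inversion of classical SBL. The function name and all parameter choices here are hypothetical.

```python
import numpy as np

def sbl_gsm_sketch(Phi, y, lam=0.05, n_iters=500, eps=0.1):
    """Inversion-free sparse recovery sketch in the spirit of the abstract.

    Alternates a hyperparameter (weight) update with an MM step on the
    coefficients: the data term 0.5*||y - Phi x||^2 is majorized by a
    separable quadratic, whose minimizer with the weighted l1 penalty is
    a soft-thresholding step. No matrix inversion is ever formed.
    """
    m, n = Phi.shape
    x = np.zeros(n)
    L = np.linalg.norm(Phi, 2) ** 2          # Lipschitz constant of the data term
    for _ in range(n_iters):
        # Hyperparameter step: per-coefficient weights from the GSM prior.
        w = lam / (np.abs(x) + eps)
        # MM step: gradient step on the surrogate, then the weighted
        # soft-thresholding proximal map (both are O(mn) per iteration).
        z = x - Phi.T @ (Phi @ x - y) / L
        x = np.sign(z) * np.maximum(np.abs(z) - w / L, 0.0)
    return x
```

On a small synthetic problem (Gaussian dictionary, a few nonzero coefficients), this loop recovers the sparse signal using only matrix-vector products, which is the practical point of avoiding the per-iteration inversion.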


Similar Articles

1. Sparse Bayesian Learning Based on Collaborative Neurodynamic Optimization. IEEE Trans Cybern. 2022 Dec;52(12):13669-13683. doi: 10.1109/TCYB.2021.3090204. Epub 2022 Nov 18.
2. Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem. IEEE Trans Signal Process. 2018 Jun 15;66(12):3124-3139. doi: 10.1109/TSP.2018.2824286. Epub 2018 Apr 6.
3. Hyperspectral Images Denoising via Nonconvex Regularized Low-Rank and Sparse Matrix Decomposition. IEEE Trans Image Process. 2020;29:44-56. doi: 10.1109/TIP.2019.2926736. Epub 2019 Jul 12.
4. Fast Sparse Aperture ISAR Autofocusing and Imaging via ADMM-Based Sparse Bayesian Learning. IEEE Trans Image Process. 2019 Dec 11. doi: 10.1109/TIP.2019.2957939.
5. Sparse Aperture InISAR Imaging via Sequential Multiple Sparse Bayesian Learning. Sensors (Basel). 2017 Oct 10;17(10):2295. doi: 10.3390/s17102295.
6. Fast and Robust Block-Sparse Bayesian Learning for EEG Source Imaging. Neuroimage. 2018 Jul 1;174:449-462. doi: 10.1016/j.neuroimage.2018.03.048. Epub 2018 Mar 27.
7. Sparse Bayesian Learning for DOA Estimation With Mutual Coupling. Sensors (Basel). 2015 Oct 16;15(10):26267-80. doi: 10.3390/s151026267.

Cited By

1. Improved Variational Bayes for Space-Time Adaptive Processing. Entropy (Basel). 2025 Feb 26;27(3):242. doi: 10.3390/e27030242.
