Nalci Alican, Fedorov Igor, Al-Shoukairi Maher, Liu Thomas T, Rao Bhaskar D
Department of Electrical and Computer Engineering, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093, USA.
Departments of Radiology, Psychiatry and Bioengineering, and UCSD Center for Functional MRI, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093, USA.
IEEE Trans Signal Process. 2018 Jun 15;66(12):3124-3139. doi: 10.1109/tsp.2018.2824286. Epub 2018 Apr 6.
In this paper, we develop a Bayesian evidence maximization framework to solve the sparse non-negative least squares (S-NNLS) problem. We introduce a family of probability densities, referred to as the Rectified Gaussian Scale Mixture (R-GSM), to model the sparsity-enforcing prior distribution for the signal of interest. With a proper choice of the mixing density, the R-GSM prior encompasses a variety of heavy-tailed distributions such as the rectified Laplacian and rectified Student-t distributions. We utilize the hierarchical representation induced by the R-GSM prior and develop an evidence maximization framework based on the Expectation-Maximization (EM) algorithm. Using the EM-based method, we estimate the hyper-parameters and obtain a point estimate for the solution of interest. We refer to the proposed method as rectified Sparse Bayesian Learning (R-SBL). We present four EM-based R-SBL variants that trade off computational complexity against the quality of the E-step computation: Markov chain Monte Carlo EM, linear minimum mean squared error (LMMSE) estimation, approximate message passing (AMP), and a diagonal approximation. Through numerical experiments, we show that the proposed R-SBL method outperforms existing S-NNLS solvers in terms of both signal and support recovery, and is robust to the structure of the design matrix.
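To make the setup concrete, the following is a minimal sketch of the problem and of the hierarchical prior the abstract describes. The symbols y, \Phi, x, v, \gamma_i and the rectified Gaussian density N_R are notational assumptions introduced here for illustration, not taken verbatim from the abstract.

    Observation model and S-NNLS objective:
        y = \Phi x + v,  with  x \ge 0  element-wise and x sparse,
        \min_{x \ge 0} \; \| y - \Phi x \|_2^2  subject to a sparsity constraint on x.

    R-GSM prior in its hierarchical form:
        x_i \mid \gamma_i \sim N_R(x_i; 0, \gamma_i),   \gamma_i \sim p(\gamma_i),
        p(x_i) = \int_0^{\infty} N_R(x_i; 0, \gamma_i) \, p(\gamma_i) \, d\gamma_i,

where N_R denotes a Gaussian restricted to the non-negative half-line. Different mixing densities p(\gamma_i) recover, for example, rectified Laplacian or rectified Student-t marginals. Roughly speaking, the EM scheme alternates between computing (or approximating) the posterior over x given the current \gamma (E-step) and updating the hyper-parameters \gamma (M-step); the four R-SBL variants differ in how the E-step is carried out.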
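As a complementary illustration, below is a small Python sketch of the generative model implied by the R-GSM prior, together with a plain NNLS baseline for comparison. The exponential mixing density, the problem dimensions, the noise level, and the names Phi, gamma, x_hat are arbitrary choices made for this example, not values or code from the paper.

    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    n, m, k = 50, 100, 10                              # measurements, signal length, active entries
    Phi = rng.standard_normal((n, m)) / np.sqrt(n)     # random design matrix

    # Hyper-parameters gamma_i drawn from an assumed exponential mixing density
    # (one member of the R-GSM family; other mixing densities give other marginals).
    gamma = rng.exponential(scale=1.0, size=m)

    # Zero-location rectified Gaussian draws: under the truncation convention
    # assumed here, |z| with z ~ N(0, gamma_i) has the desired law on x >= 0.
    x = np.abs(rng.standard_normal(m)) * np.sqrt(gamma)

    # Impose sparsity by zeroing all but k randomly chosen entries.
    support = rng.choice(m, size=k, replace=False)
    mask = np.zeros(m)
    mask[support] = 1.0
    x *= mask

    y = Phi @ x + 0.01 * rng.standard_normal(n)        # noisy non-negative observations

    # Plain NNLS (no sparsity-promoting prior) as a simple baseline solver.
    x_hat, residual = nnls(Phi, y)
    recovered = set(np.flatnonzero(x_hat > 1e-6))
    print(f"baseline NNLS recovers {len(recovered & set(support))} of {k} active entries")

The sketch only illustrates the data model the abstract assumes; the R-SBL variants (MCMC-EM, LMMSE, AMP, diagonal approximation) would replace the final NNLS call with an EM loop that updates the gamma_i, which is not reproduced here.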