Neural Comput. 2013 Aug;25(8):2172-98. doi: 10.1162/NECO_a_00379. Epub 2013 Apr 22.
Chandrasekaran, Parrilo, and Willsky (2012) proposed a convex optimization problem for graphical model selection in the presence of unobserved variables. The goal of this convex program is to estimate, from sample data, an inverse covariance matrix that can be decomposed into a sparse matrix minus a low-rank matrix. Solving this convex optimization problem is very challenging, especially for large problems. In this letter, we propose two alternating direction methods for solving it. The first method applies the classic alternating direction method of multipliers to the problem recast as a consensus problem. The second method is a proximal gradient-based alternating direction method of multipliers. Our methods take advantage of the special structure of the problem and thus can solve large instances very efficiently. A global convergence result is established for the proposed methods. Numerical results on both synthetic data and gene expression data show that our methods usually solve problems with 1 million variables in 1 to 2 minutes and are usually 5 to 35 times faster than a state-of-the-art Newton-CG proximal point algorithm.
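For context, the convex program of Chandrasekaran et al. (2012) is commonly written in the following form, where the inverse covariance is split into a sparse part S and a low-rank part L; the symbols Σ̂ (sample covariance) and the penalty parameters α, β below are a standard parameterization, not notation taken from this abstract:

\[
\min_{S,\,L}\; -\log\det(S-L) + \langle \hat{\Sigma},\, S-L\rangle + \alpha\,\|S\|_1 + \beta\,\mathrm{tr}(L)
\quad \text{s.t.}\quad S-L \succ 0,\; L \succeq 0.
\]

The sketch below illustrates, under the same assumptions, how an alternating direction method of multipliers can exploit the structure of this program: each subproblem has a closed-form update (an eigenvalue-based prox for the log-det term, soft-thresholding for the l1 term, and a projected eigenvalue shrinkage for the trace/PSD term). It is an illustrative three-block splitting, not the authors' exact algorithm, and all function and variable names are hypothetical.

```python
import numpy as np

def logdet_prox(A, rho):
    """Closed-form minimizer of -logdet(R) + (rho/2)*||R - A||_F^2 over R > 0."""
    lam, Q = np.linalg.eigh(A)
    r = (lam + np.sqrt(lam**2 + 4.0 / rho)) / 2.0
    return (Q * r) @ Q.T

def soft_threshold(B, tau):
    """Elementwise soft-thresholding: prox of tau*||.||_1."""
    return np.sign(B) * np.maximum(np.abs(B) - tau, 0.0)

def psd_trace_prox(C, tau):
    """Minimizer of tau*tr(L) + (1/2)*||L - C||_F^2 over PSD matrices L."""
    lam, Q = np.linalg.eigh(C)
    lam = np.maximum(lam - tau, 0.0)
    return (Q * lam) @ Q.T

def lvglasso_admm(Sigma_hat, alpha, beta, rho=1.0, n_iter=200):
    """Illustrative ADMM sketch for
         min -logdet(R) + <Sigma_hat, R> + alpha*||S||_1 + beta*tr(L)
         s.t. R = S - L,  L PSD.
    Parameter choices and the fixed iteration count are assumptions."""
    p = Sigma_hat.shape[0]
    S = np.eye(p)
    L = np.zeros((p, p))
    U = np.zeros((p, p))  # scaled dual variable for the constraint R = S - L
    for _ in range(n_iter):
        # R-update: log-det prox, with the data term folded into the shift.
        R = logdet_prox(S - L - U - Sigma_hat / rho, rho)
        # S-update: soft-thresholding enforces sparsity of S.
        S = soft_threshold(R + L + U, alpha / rho)
        # L-update: trace shrinkage plus PSD projection encourages low rank.
        L = psd_trace_prox(S - R - U, beta / rho)
        # Dual ascent on the consensus constraint R - S + L = 0.
        U = U + (R - S + L)
    return S, L
```

Each iteration costs one or two eigendecompositions plus elementwise operations, which is what makes alternating direction schemes of this kind attractive for the problem sizes reported in the abstract.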