Scalable Proximal Jacobian Iteration Method With Global Convergence Analysis for Nonconvex Unconstrained Composite Optimizations.

Author Information

Zhang Hengmin, Qian Jianjun, Gao Junbin, Yang Jian, Xu Chunyan

Publication Information

IEEE Trans Neural Netw Learn Syst. 2019 Sep;30(9):2825-2839. doi: 10.1109/TNNLS.2018.2885699. Epub 2019 Jan 15.

Abstract

Recent studies have found that nonconvex relaxation functions usually perform better than their convex counterparts in l-norm and rank function minimization problems. However, owing to the absence of convexity in these problems, developing efficient algorithms with convergence guarantees becomes very challenging. Inspired by the basic ideas of both the Jacobian alternating direction methods of multipliers (JADMMs) for solving linearly constrained problems with separable objectives and the proximal gradient methods (PGMs) for optimizing unconstrained problems with a single variable, this paper focuses on extending the PGMs to proximal Jacobian iteration methods (PJIMs) for handling a family of nonconvex composite optimization problems with two splitting variables. To reduce the total computational complexity by decreasing the number of iterations, we devise an accelerated version of the PJIMs through the well-known Nesterov acceleration strategy and further extend both methods to the multivariable case. Most importantly, we provide a rigorous convergence analysis showing that the generated variable sequence globally converges to a critical point by exploiting the Kurdyka-Łojasiewicz (KŁ) property, which holds for a broad class of functions. Furthermore, we establish linear and sublinear convergence rates of the obtained variable sequence in terms of the objective function values. As specific applications to nonconvex sparse and low-rank recovery problems, several numerical experiments verify that the newly proposed algorithms not only converge quickly but also attain high precision.
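To make the Jacobian-style updates described in the abstract concrete, the following is a minimal Python sketch, not the paper's exact PJIM or its accelerated variant. It tackles a toy two-block composite problem, min over (x, y) of 0.5*||A x + B y - b||^2 + lam1*||x||_0 + lam2*||y||_0, updates both blocks in parallel with proximal-gradient steps, and optionally applies a FISTA-style Nesterov extrapolation. The problem instance, the hard-thresholding prox, the step-size rule, and all function names are illustrative assumptions rather than the authors' formulation.

```python
import numpy as np

def prox_l0(v, lam):
    """Hard thresholding: proximal operator of lam * ||v||_0,
    used here as a simple stand-in for a nonconvex regularizer."""
    w = v.copy()
    w[v * v <= 2.0 * lam] = 0.0
    return w

def proximal_jacobian(A, B, b, lam1, lam2, n_iter=500, accelerate=True):
    """Minimize 0.5*||A@x + B@y - b||^2 + lam1*||x||_0 + lam2*||y||_0
    by updating the two blocks x and y in parallel (Jacobi style),
    each via a proximal-gradient step, with optional Nesterov-type
    extrapolation. Step sizes below are conservative guesses; the
    paper's analysis prescribes the admissible range."""
    x, y = np.zeros(A.shape[1]), np.zeros(B.shape[1])
    x_old, y_old = x.copy(), y.copy()
    tau = 1.0 / (np.linalg.norm(A, 2) ** 2 + np.linalg.norm(B, 2) ** 2)
    sigma = tau
    t = 1.0
    for _ in range(n_iter):
        if accelerate:
            # FISTA-style extrapolation schedule (illustrative choice).
            t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
            beta = (t - 1.0) / t_next
            xe, ye = x + beta * (x - x_old), y + beta * (y - y_old)
            t = t_next
        else:
            xe, ye = x, y
        # Jacobi update: both blocks use the residual at the same point,
        # so the two proximal steps are independent of each other.
        r = A @ xe + B @ ye - b
        x_old, y_old = x, y
        x = prox_l0(xe - tau * (A.T @ r), tau * lam1)
        y = prox_l0(ye - sigma * (B.T @ r), sigma * lam2)
    return x, y

# Toy usage: recover two sparse blocks from noisy linear measurements.
rng = np.random.default_rng(0)
A, B = rng.standard_normal((50, 100)), rng.standard_normal((50, 100))
x_true = np.zeros(100); x_true[:5] = 3.0
y_true = np.zeros(100); y_true[-5:] = -3.0
b = A @ x_true + B @ y_true + 0.01 * rng.standard_normal(50)
x_hat, y_hat = proximal_jacobian(A, B, b, lam1=0.1, lam2=0.1)
```

Because each proximal step depends only on the previous (or extrapolated) iterate rather than on the other block's new value, the two updates can be evaluated in parallel, which is the practical appeal of Jacobian-style splitting over sequential Gauss-Seidel updates.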

