
Gradient-type penalty method with inertial effects for solving constrained convex optimization problems with smooth data

Author Information

Boţ Radu Ioan, Csetnek Ernö Robert, Nimana Nimit

Affiliations

1. Faculty of Mathematics, University of Vienna, Oskar-Morgenstern-Platz 1, 1090 Vienna, Austria.

2. Faculty of Mathematics and Computer Sciences, Babeş-Bolyai University, Str. M. Kogălniceanu nr. 1, 400084 Cluj-Napoca, Romania.

Publication Information

Optim Lett. 2018;12(1):17-33. doi: 10.1007/s11590-017-1158-1. Epub 2017 Jun 14.

Abstract

We consider the problem of minimizing a smooth convex objective function over the set of minima of another differentiable convex function. In order to solve this problem, we propose an algorithm which combines the gradient method with a penalization technique. Moreover, we insert in our algorithm an inertial term, which is able to take advantage of the history of the iterates. We show weak convergence of the generated sequence of iterates to an optimal solution of the optimization problem, provided a condition expressed via the Fenchel conjugate of the constraint function is fulfilled. We also prove convergence of the objective function values to the optimal objective value. The convergence analysis carried out in this paper relies on the celebrated Opial Lemma and generalized Fejér monotonicity techniques. We illustrate the functionality of the method with a numerical experiment addressing image classification via support vector machines.
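
To make the scheme concrete, the display below sketches a generic inertial gradient-penalty iteration of the kind described above. The notation and the exact form of the update are assumptions inferred from the abstract; the precise step-size, penalty-parameter, and inertial-parameter conditions under which convergence is proved are stated in the full text.

% Sketch inferred from the abstract; the paper specifies the exact parameter conditions.
\[
x_{n+1} = x_n + \alpha_n\,(x_n - x_{n-1}) - \lambda_n\bigl(\nabla f(x_n) + \beta_n\,\nabla g(x_n)\bigr), \qquad n \ge 1,
\]

where \(f\) is the smooth convex objective, \(g\) is the differentiable convex function whose set of minimizers forms the constraint set, \(\lambda_n > 0\) are step sizes, \(\beta_n \to +\infty\) are penalty parameters that progressively enforce the constraint, and \(\alpha_n \in [0,1)\) weights the inertial term built from the previous displacement \(x_n - x_{n-1}\).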


Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4102/6956900/992302b53033/11590_2017_1158_Fig1_HTML.jpg
