Boţ Radu Ioan, Csetnek Ernö Robert, Nimana Nimit
1 Faculty of Mathematics, University of Vienna, Oskar-Morgenstern-Platz 1, 1090 Vienna, Austria.
2 Faculty of Mathematics and Computer Sciences, Babeş-Bolyai University, Str. M. Kogălniceanu nr. 1, 400084 Cluj-Napoca, Romania.
Optim Lett. 2018;12(1):17-33. doi: 10.1007/s11590-017-1158-1. Epub 2017 Jun 14.
We consider the problem of minimizing a smooth convex objective function over the set of minimizers of another differentiable convex function. To solve this problem, we propose an algorithm that combines the gradient method with a penalization technique. Moreover, we endow the algorithm with an inertial term, which exploits the history of the iterates. We show weak convergence of the generated sequence of iterates to an optimal solution of the optimization problem, provided that a condition expressed via the Fenchel conjugate of the constraint function is fulfilled. We also prove convergence of the objective function values to the optimal objective value. The convergence analysis carried out in this paper relies on the celebrated Opial Lemma and generalized Fejér monotonicity techniques. We illustrate the functionality of the method in a numerical experiment addressing image classification with support vector machines.
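For orientation, the bilevel problem and the iteration described in the abstract can be written out as below. This display is a sketch consistent with the abstract, not a verbatim reproduction of the paper's scheme: the admissible parameter sequences (alpha_n), (lambda_n), (beta_n) and the precise Fenchel-conjugate condition on the constraint function g are specified in the paper itself.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\DeclareMathOperator*{\argmin}{argmin}
\begin{document}
% Bilevel problem: minimize f over the set of minimizers of g,
% with f smooth convex and g differentiable convex on a Hilbert space H.
\[
  \min_{x \in \mathcal{H}} f(x)
  \quad \text{subject to} \quad
  x \in \argmin_{y \in \mathcal{H}} g(y).
\]
% Gradient--penalty iteration with inertial term (sketch; the exact
% conditions on the sequences below are given in the paper):
\[
  x_{n+1} = x_n + \alpha_n (x_n - x_{n-1})
            - \lambda_n \bigl( \nabla f(x_n) + \beta_n \nabla g(x_n) \bigr),
\]
% where alpha_n controls the inertial effect, lambda_n is the step size,
% and beta_n is a penalty parameter attached to the constraint function g.
\end{document}
```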
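A minimal numerical sketch of such an inertial gradient-penalty iteration on a toy instance follows. The function `inertial_penalty_gradient`, the constant choices of the step size and inertial parameter, and the square-root penalty schedule are illustrative assumptions for this sketch; they are not the parameter conditions required by the paper's convergence theory.

```python
import numpy as np

def inertial_penalty_gradient(grad_f, grad_g, x0, n_iter=20000,
                              lam=1e-3, alpha=0.1, beta0=1.0):
    """Iterate x_{n+1} = x_n + alpha*(x_n - x_{n-1})
                        - lam*(grad_f(x_n) + beta_n*grad_g(x_n)),
    with a slowly increasing penalty beta_n (illustrative schedule)."""
    x_prev = np.asarray(x0, dtype=float).copy()
    x = x_prev.copy()
    for n in range(n_iter):
        beta = beta0 * np.sqrt(n + 1.0)  # hypothetical growing penalty schedule
        x_next = x + alpha * (x - x_prev) - lam * (grad_f(x) + beta * grad_g(x))
        x_prev, x = x, x_next
    return x

# Toy bilevel instance: f(x) = 0.5*||x - a||^2 and g(x) = 0.5*(c @ x - b)^2,
# so argmin g = {x : c @ x = b} and the unique solution is the projection
# of a onto that hyperplane.
a = np.array([3.0, 1.0])
c = np.array([1.0, 1.0])
b = 1.0

x_star = inertial_penalty_gradient(
    grad_f=lambda x: x - a,
    grad_g=lambda x: (c @ x - b) * c,
    x0=np.zeros(2),
)
print(x_star)  # expected to approach the projection [1.5, -0.5]
```

On this instance the iterates track the solution of the penalized problem, whose constraint violation shrinks as beta_n grows, so the final iterate lies close to the projection of a onto the constraint set.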