IEEE Trans Neural Netw Learn Syst. 2016 Mar;27(3):636-47. doi: 10.1109/TNNLS.2015.2418224. Epub 2015 Apr 21.
We introduce a novel algorithm for solving learning problems where both the loss function and the regularizer are nonconvex but belong to the class of difference of convex (DC) functions. Our contribution is a new general-purpose proximal Newton algorithm that is able to handle such a situation. The algorithm consists of obtaining a descent direction from an approximation of the loss function and then performing a line search to ensure sufficient descent. A theoretical analysis shows that the limit points of the iterates of the proposed algorithm are stationary points of the DC objective function. Numerical experiments show that our approach is more efficient than the current state of the art for a problem with a convex loss function and a nonconvex regularizer. We also illustrate the benefit of our algorithm on a high-dimensional transductive learning problem where both the loss function and the regularizer are nonconvex.
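The core iteration described above (a descent direction from an approximation of the objective, followed by a line search) can be sketched as follows. This is a simplified illustration, not the paper's exact method: it uses a scalar (identity) approximation of the Hessian rather than a full proximal Newton model, a least-squares loss, and a capped-ℓ1 penalty written as a DC function r(w) = λ|w| − λ max(|w| − θ, 0). The concave part is linearized at the current iterate, the convex part is handled by its proximal operator (soft-thresholding), and backtracking enforces a sufficient-descent condition.

```python
import numpy as np

def soft_threshold(z, tau):
    """Proximal operator of tau * ||.||_1 (the convex part of the penalty)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def capped_l1(w, lam, theta):
    """Nonconvex capped-l1 penalty: lam * sum_i min(|w_i|, theta)."""
    return lam * np.minimum(np.abs(w), theta).sum()

def objective(X, y, w, lam, theta):
    """Full DC objective: smooth least-squares loss + capped-l1 regularizer."""
    return 0.5 * np.sum((X @ w - y) ** 2) + capped_l1(w, lam, theta)

def dc_prox_step(X, y, w, lam, theta, t0=1.0, beta=0.5, sigma=1e-4, max_ls=50):
    """One DC proximal step with backtracking line search (illustrative sketch).

    The concave part r2(w) = lam * max(|w| - theta, 0) is linearized via a
    subgradient; the candidate is the prox of the remaining convex model.
    """
    grad = X.T @ (X @ w - y)                       # gradient of the smooth loss
    sub_r2 = lam * np.sign(w) * (np.abs(w) > theta)  # subgradient of concave part
    f0 = objective(X, y, w, lam, theta)
    t = t0
    for _ in range(max_ls):
        w_new = soft_threshold(w - t * (grad - sub_r2), t * lam)
        # Sufficient-descent test: accept if the objective drops enough.
        if objective(X, y, w_new, lam, theta) <= f0 - sigma * np.sum((w_new - w) ** 2) / t:
            return w_new
        t *= beta                                  # shrink step and retry
    return w                                       # no acceptable step found

# Toy usage: a few iterations on random data; the objective is monotone nonincreasing.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
y = rng.standard_normal(20)
w = np.zeros(5)
for _ in range(50):
    w = dc_prox_step(X, y, w, lam=0.1, theta=1.0)
```

The sufficient-descent condition above is what the paper's analysis relies on to guarantee that limit points of the iterates are stationary points of the DC objective; the specific loss, penalty, and Hessian approximation here are placeholder choices for illustration.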