DC Proximal Newton for Nonconvex Optimization Problems.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2016 Mar;27(3):636-47. doi: 10.1109/TNNLS.2015.2418224. Epub 2015 Apr 21.

Abstract

We introduce a novel algorithm for solving learning problems in which both the loss function and the regularizer are nonconvex but belong to the class of difference-of-convex (DC) functions. Our contribution is a new general-purpose proximal Newton algorithm able to handle such a situation. The algorithm obtains a descent direction from an approximation of the loss function and then performs a line search to ensure sufficient descent. A theoretical analysis shows that the limit points of the iterates produced by the algorithm are stationary points of the DC objective function. Numerical experiments show that our approach is more efficient than the current state of the art on a problem with a convex loss function and a nonconvex regularizer. We also illustrate the benefit of our algorithm on a high-dimensional transductive learning problem where both the loss function and the regularizer are nonconvex.
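The abstract describes a two-step iteration: build a convex model of the DC objective at the current point, extract a proximal Newton descent direction from it, then backtrack until sufficient descent holds. The sketch below is one minimal reading of that recipe, assuming DC decompositions f = f1 - f2 and r = r1 - r2 with f1 smooth and convex and r1 convex but possibly nonsmooth; it is not the paper's exact method, and every callback name (grad_f1, scaled_prox_r1, and so on) is an illustrative placeholder rather than the authors' notation.

```python
import numpy as np

def dc_proximal_newton(x0, grad_f1, hess_f1, grad_f2, subgrad_r2,
                       scaled_prox_r1, r1, objective,
                       max_iter=100, beta=0.5, sigma=1e-4, tol=1e-8):
    """Sketch of a DC proximal Newton loop (illustrative, not the paper's code).

    Assumes F = (f1 + r1) - (f2 + r2): f1 smooth convex, r1 convex and
    possibly nonsmooth, f2 and r2 convex, so f = f1 - f2 and r = r1 - r2
    are DC functions. `objective` evaluates the full DC objective F.
    """
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        H = hess_f1(x)  # (approximate) Hessian of the smooth convex part
        # Linearize the concave parts -f2 and -r2 at x; g is the gradient
        # of the smooth piece of the resulting convex model.
        g = grad_f1(x) - grad_f2(x) - subgrad_r2(x)
        # Descent direction from the convex subproblem
        #   min_z  g.(z - x) + 0.5 (z - x)' H (z - x) + r1(z),
        # i.e. a prox of r1 in the metric H, evaluated at x - inv(H) g;
        # the scaled prox is delegated to a user-supplied solver.
        z = scaled_prox_r1(x - np.linalg.solve(H, g), H)
        d = z - x
        if np.linalg.norm(d) < tol:
            break
        # Model decrease; negative whenever H is positive definite.
        delta = g.dot(d) + r1(z) - r1(x)
        # Backtracking line search enforcing sufficient descent.
        t = 1.0
        while objective(x + t * d) > objective(x) + sigma * t * delta and t > 1e-12:
            t *= beta
        x = x + t * d
    return x
```

In practice the scaled proximal subproblem rarely has a closed form and is typically solved inexactly by an inner convex solver; the convergence claim in the abstract concerns the limit points of the outer iterates.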
