
Bilevel Parameter Learning for Higher-Order Total Variation Regularisation Models

Authors

De Los Reyes J C, Schönlieb C-B, Valkonen T

Affiliations

1Research Center on Mathematical Modelling (MODEMAT), Escuela Politécnica Nacional, Quito, Ecuador.

2Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge, UK.

Publication Information

J Math Imaging Vis. 2017;57(1):1-25. doi: 10.1007/s10851-016-0662-8. Epub 2016 Jun 1.

Abstract

We consider a bilevel optimisation approach for parameter learning in higher-order total variation image reconstruction models. Apart from the least-squares cost functional naturally used in bilevel learning, we propose and analyse an alternative cost based on a Huber-regularised TV seminorm. Differentiability properties of the solution operator are verified and a first-order optimality system is derived. Based on the adjoint information, a combined quasi-Newton/semismooth Newton algorithm is proposed for the numerical solution of the bilevel problems. Numerical experiments are carried out to show the suitability of our approach and the improved performance of the new cost functional. Thanks to the bilevel optimisation framework, a detailed comparison between the two regularisers is also carried out, showing the advantages and shortcomings of each depending on the structure of the processed images and their noise level.
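The bilevel structure described in the abstract can be illustrated with a toy sketch: the lower level is a smoothed (Huber-regularised) 1-D TV denoising problem solved by plain gradient descent, and the upper level minimises the least-squares distance to a clean training signal over the scalar regularisation weight. Everything here (the signal, the brute-force outer search, all names) is an illustrative assumption; the paper itself uses adjoint-based derivatives with a combined quasi-Newton/semismooth Newton solver, not a grid search.

```python
import numpy as np

def huber_grad(t, gamma):
    # Derivative of the Huber smoothing of |t|: t/gamma near 0, +/-1 beyond
    return np.clip(t / gamma, -1.0, 1.0)

def denoise_huber_tv(f, lam, gamma=0.05, steps=500):
    """Lower-level problem: min_u 0.5*||u - f||^2 + lam * sum_i huber((Du)_i)
    with the 1-D forward difference D, solved by gradient descent
    (a sketch only; the paper uses a semismooth Newton method)."""
    u = f.copy()
    tau = 1.0 / (1.0 + 4.0 * lam / gamma)   # safe step from a Lipschitz bound
    for _ in range(steps):
        hg = huber_grad(np.diff(u), gamma)
        # gradient of the TV term: minus the divergence of huber_grad(Du)
        g_tv = np.concatenate(([0.0], hg)) - np.concatenate((hg, [0.0]))
        u = u - tau * ((u - f) + lam * g_tv)
    return u

# Upper level: least-squares distance to a clean training signal,
# minimised here by brute force over the scalar weight lam.
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, 0.3], 40)               # piecewise-constant signal
noisy = clean + 0.1 * rng.standard_normal(clean.size)

lams = np.linspace(0.0, 1.0, 21)
costs = [float(np.sum((denoise_huber_tv(noisy, l) - clean) ** 2)) for l in lams]
best_lam = float(lams[int(np.argmin(costs))])
```

The optimal weight found this way depends on the noise level and the structure of the training image, which is exactly the trade-off the paper's comparison between regularisers examines.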


Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7397/7175605/5cfd04f270a2/10851_2016_662_Fig1_HTML.jpg
