Schwab Johannes, Antholzer Stephan, Haltmeier Markus
Department of Mathematics, University of Innsbruck, Technikerstrasse 13, 6020 Innsbruck, Austria.
J Math Imaging Vis. 2020;62(3):445-455. doi: 10.1007/s10851-019-00911-1. Epub 2019 Oct 3.
Deep learning and (deep) neural networks are emerging tools for addressing inverse problems and image reconstruction tasks. Despite their outstanding performance, a mathematical analysis of solving inverse problems with neural networks is largely missing. In this paper, we introduce and rigorously analyze families of deep regularizing neural networks (RegNets) of the form $B_\alpha + N_{\theta(\alpha)} B_\alpha$, where $B_\alpha$ is a classical regularization and the network $N_{\theta(\alpha)} B_\alpha$ is trained to recover the missing part $\operatorname{Id}_X - B_\alpha A$ not found by the classical regularization. We show that these regularizing networks yield a convergent regularization method for solving inverse problems. Additionally, we derive convergence rates (quantitative error estimates) assuming sufficient decay of the associated distance function. We demonstrate that our results recover existing convergence and convergence rate results for filter-based regularization methods, as well as for the recently introduced null space network, as special cases. Numerical results are presented for a tomographic sparse data problem, clearly demonstrating that the proposed RegNets improve both classical regularization and the null space network.
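The two-part reconstruction described in the abstract can be sketched in a few lines. The following is a minimal toy illustration, not the authors' implementation, of the null space network special case: a Tikhonov-regularized least-squares solve plays the role of the classical regularization $B_\alpha$, and an untrained random linear map projected into $\operatorname{null}(A)$ stands in for the trained network. The toy operator `A` and all helper names are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical underdetermined forward operator: 5 measurements of 8 unknowns.
A = rng.standard_normal((5, 8))

def B_alpha(y, alpha=1e-2):
    """Classical regularization: Tikhonov-regularized least squares."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

# Orthogonal projector onto null(A), computed from the SVD.
_, _, Vt = np.linalg.svd(A)
rank = np.linalg.matrix_rank(A)
Vn = Vt[rank:]                # rows spanning the null space of A
P_null = Vn.T @ Vn

# Untrained stand-in for the network: a random linear map whose output is
# projected into null(A), so it only adds components invisible to A.
W = rng.standard_normal((8, 8))
def N_theta(x):
    return P_null @ (W @ x)

def regnet(y, alpha=1e-2):
    """RegNet-style reconstruction: B_alpha(y) + N_theta(B_alpha(y))."""
    x0 = B_alpha(y, alpha)
    return x0 + N_theta(x0)

y = A @ rng.standard_normal(8)
x_classic = B_alpha(y)
x_reg = regnet(y)

# The network correction lies in null(A), so both reconstructions explain
# the measured data equally well while differing in the unseen components.
print(np.allclose(A @ x_classic, A @ x_reg))  # True
```

Because the correction is confined to the null space, data consistency of the classical reconstruction is preserved exactly; in the general RegNet families analyzed in the paper, the network is instead trained, with the regularization parameter, so that the combined method remains a convergent regularization.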