Equivariant neural networks for inverse problems.

Author information

Celledoni Elena, Ehrhardt Matthias J, Etmann Christian, Owren Brynjulf, Schönlieb Carola-Bibiane, Sherry Ferdia

Affiliations

Department of Mathematical Sciences, NTNU, N-7491 Trondheim, Norway.

Institute for Mathematical Innovation, University of Bath, Bath BA2 7JU, United Kingdom.

Publication information

Inverse Probl. 2021 Aug;37(8):085006. doi: 10.1088/1361-6420/ac104f. Epub 2021 Jul 26.

Abstract

In recent years the use of convolutional layers to encode an inductive bias (translational equivariance) in neural networks has proven to be a very fruitful idea. The successes of this approach have motivated a line of research into incorporating other symmetries into deep learning methods, in the form of group equivariant convolutional neural networks. Much of this work has been focused on roto-translational symmetry of $\mathbb{R}^n$, but other examples are the scaling symmetry of $\mathbb{R}^n$ and rotational symmetry of the sphere. In this work, we demonstrate that group equivariant convolutional operations can naturally be incorporated into learned reconstruction methods for inverse problems that are motivated by the variational regularisation approach. Indeed, if the regularisation functional is invariant under a group symmetry, the corresponding proximal operator will satisfy an equivariance property with respect to the same group symmetry. As a result of this observation, we design learned iterative methods in which the proximal operators are modelled as group equivariant convolutional neural networks. We use roto-translationally equivariant operations in the proposed methodology and apply it to the problems of low-dose computerised tomography reconstruction and subsampled magnetic resonance imaging reconstruction. The proposed methodology is demonstrated to improve the reconstruction quality of a learned reconstruction method with a little extra computational cost at training time but without any extra cost at test time.
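
The abstract's central observation can be made precise in a short derivation. Writing $T_g$ for a norm-preserving (e.g. orthogonal) action of a group element $g$ and assuming the regularisation functional $R$ is invariant, $R(T_g x) = R(x)$, the proximal operator commutes with the group action:

```latex
\begin{aligned}
\operatorname{prox}_R(T_g y)
  &= \operatorname*{arg\,min}_x \; \tfrac{1}{2}\|x - T_g y\|^2 + R(x) \\
  &= T_g \operatorname*{arg\,min}_z \; \tfrac{1}{2}\|T_g z - T_g y\|^2 + R(T_g z)
     \qquad (\text{substituting } x = T_g z) \\
  &= T_g \operatorname*{arg\,min}_z \; \tfrac{1}{2}\|z - y\|^2 + R(z)
     \qquad (T_g \text{ is an isometry and } R \circ T_g = R) \\
  &= T_g \operatorname{prox}_R(y).
\end{aligned}
```

The following sketch illustrates the kind of learned iterative method the abstract refers to: an unrolled proximal gradient scheme in which each proximal step is replaced by a trainable network. It is a minimal illustration under stated assumptions, not the authors' implementation; the class names, the forward operator `A` and its adjoint `A_adjoint`, and the plain `nn.Conv2d` layers are illustrative choices. In the paper the proximal networks are roto-translationally equivariant CNNs, so each learned step inherits the equivariance property derived above.

```python
# Minimal sketch of an unrolled proximal gradient method with learned
# proximal steps (hypothetical code, not the paper's implementation).
# `A` / `A_adjoint` stand for the forward operator of the inverse problem
# and its adjoint, e.g. a CT projector or a subsampled Fourier transform.
import torch
import torch.nn as nn


class ProxNet(nn.Module):
    """Small CNN playing the role of a learned proximal operator."""

    def __init__(self, channels: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual update: the network learns a correction to its input.
        return x + self.net(x)


class UnrolledProximalGradient(nn.Module):
    """Unrolled proximal gradient iterations with learned proximal steps."""

    def __init__(self, A, A_adjoint, n_iters: int = 10, step_size: float = 0.1):
        super().__init__()
        self.A, self.A_adjoint = A, A_adjoint
        self.step_size = step_size
        # One proximal network per unrolled iteration (parameters not shared).
        self.prox_nets = nn.ModuleList([ProxNet() for _ in range(n_iters)])

    def forward(self, y: torch.Tensor) -> torch.Tensor:
        # Initialise with a simple back-projection of the measured data.
        x = self.A_adjoint(y)
        for prox in self.prox_nets:
            # Gradient step on the data fidelity 0.5 * ||A x - y||^2 ...
            grad = self.A_adjoint(self.A(x) - y)
            # ... followed by the learned proximal (regularisation) step.
            x = prox(x - self.step_size * grad)
        return x
```

Replacing the `nn.Conv2d` layers with group equivariant convolutions (e.g. from a group equivariant CNN library) would make each `ProxNet` equivariant to the chosen symmetry group, which is the modification the paper studies; consistent with the abstract, the extra cost of such layers is incurred at training time, not at test time.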

Figure 1 (full-resolution image): https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b61d/8317019/c8c70b601600/ipac104ff1_hr.jpg
