Celledoni Elena, Ehrhardt Matthias J, Etmann Christian, Owren Brynjulf, Schönlieb Carola-Bibiane, Sherry Ferdia
Department of Mathematical Sciences, NTNU, N-7491 Trondheim, Norway.
Institute for Mathematical Innovation, University of Bath, Bath BA2 7JU, United Kingdom.
Inverse Probl. 2021 Aug;37(8):085006. doi: 10.1088/1361-6420/ac104f. Epub 2021 Jul 26.
In recent years, the use of convolutional layers to encode an inductive bias (translational equivariance) in neural networks has proven to be a very fruitful idea. The successes of this approach have motivated a line of research into incorporating other symmetries into deep learning methods, in the form of group equivariant convolutional neural networks. Much of this work has focused on the roto-translational symmetry of $\mathbb{R}^n$, but other examples are the scaling symmetry of $\mathbb{R}^n$ and the rotational symmetry of the sphere. In this work, we demonstrate that group equivariant convolutional operations can naturally be incorporated into learned reconstruction methods for inverse problems that are motivated by the variational regularisation approach. Indeed, if the regularisation functional is invariant under a group symmetry, the corresponding proximal operator satisfies an equivariance property with respect to the same group symmetry. As a result of this observation, we design learned iterative methods in which the proximal operators are modelled as group equivariant convolutional neural networks. We use roto-translationally equivariant operations in the proposed methodology and apply it to the problems of low-dose computerised tomography reconstruction and subsampled magnetic resonance imaging reconstruction. The proposed methodology is demonstrated to improve the reconstruction quality of a learned reconstruction method, at a small extra computational cost at training time and no extra cost at test time.
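A minimal sketch of the invariance-to-equivariance argument referenced in the abstract, under the assumption (not spelled out in this record; the full paper states the precise conditions) that the group $G$ acts on the image space through norm-preserving linear operators $T_g$: if the regulariser $R$ is $G$-invariant, i.e. $R(T_g x) = R(x)$ for all $g \in G$, then for any $y$,
\begin{align*}
\operatorname{prox}_R(T_g y)
  &= \operatorname*{arg\,min}_x \tfrac{1}{2}\|x - T_g y\|^2 + R(x) \\
  &= T_g \operatorname*{arg\,min}_z \tfrac{1}{2}\|T_g z - T_g y\|^2 + R(T_g z)
     && (\text{substituting } x = T_g z) \\
  &= T_g \operatorname*{arg\,min}_z \tfrac{1}{2}\|z - y\|^2 + R(z)
     && (\text{norm preservation and invariance of } R) \\
  &= T_g \operatorname{prox}_R(y).
\end{align*}
Thus the proximal operator of an invariant regulariser is $G$-equivariant, which motivates modelling the proximal steps of a learned iterative scheme by group equivariant convolutional neural networks.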