
Dense Recurrent Neural Networks for Accelerated MRI: History-Cognizant Unrolling of Optimization Algorithms

Author Information

Hosseini Seyed Amir Hossein, Yaman Burhaneddin, Moeller Steen, Hong Mingyi, Akçakaya Mehmet

Affiliations

Department of Electrical and Computer Engineering, and Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, MN, 55455.

Center for Magnetic Resonance Research, University of Minnesota, Minneapolis, MN, 55455.

Publication Information

IEEE J Sel Top Signal Process. 2020 Oct;14(6):1280-1291. doi: 10.1109/jstsp.2020.3003170. Epub 2020 Jun 17.

Abstract

Inverse problems for accelerated MRI typically incorporate domain-specific knowledge about the forward encoding operator in a regularized reconstruction framework. Recently, physics-driven deep learning (DL) methods have been proposed that use neural networks for data-driven regularization. These methods unroll iterative optimization algorithms that solve the inverse problem's objective function, alternating between domain-specific data consistency and data-driven regularization via neural networks. The whole unrolled network is then trained end-to-end to learn the parameters of the network. Due to the simplicity of data consistency updates with gradient descent steps, proximal gradient descent (PGD) is a common approach for unrolling physics-driven DL reconstruction methods. However, PGD methods have slow convergence rates, necessitating a higher number of unrolled iterations, which leads to memory issues during training and slower reconstruction times at testing. Inspired by efficient variants of PGD that use a history of the previous iterates, we propose a history-cognizant unrolling of the optimization algorithm with dense connections across iterations for improved performance. In our approach, the gradient descent steps are calculated at a trainable combination of the outputs of all the previous regularization units. We also apply this idea to unrolling variable splitting methods with quadratic relaxation. Our results on reconstruction of the fastMRI knee dataset show that the proposed history-cognizant approach reduces residual aliasing artifacts compared to its conventional unrolled counterpart, without requiring extra computational power or increasing reconstruction time.
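The unrolling scheme the abstract describes — a gradient descent step evaluated at a trainable combination of the outputs of all previous regularization units — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the paper's implementation: soft-thresholding stands in for the learned regularizer network, the dense-combination weights are fixed to a uniform average of the history (in the paper they are trained end-to-end with the rest of the network), and `A` is a generic matrix rather than a multi-coil MRI encoding operator.

```python
import numpy as np

def soft_threshold(x, tau):
    # Stand-in for the learned regularization unit (a denoiser network in the paper).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def history_cognizant_pgd(y, A, n_iters=10, alpha=0.1, tau=0.01, weights=None):
    """Unrolled PGD with dense connections across iterations: each gradient
    step is taken at a weighted combination of ALL previous regularizer
    outputs, not just the last one. `weights[i]` would be trainable in the
    end-to-end network; here it defaults to a uniform average of the history."""
    m, n = A.shape
    history = [np.zeros(n)]                                 # z^0 (zero initialization)
    for i in range(n_iters):
        if weights is None:
            w = np.ones(len(history)) / len(history)        # uniform placeholder weights
        else:
            w = weights[i][: len(history)]
        x_bar = sum(wj * zj for wj, zj in zip(w, history))  # dense combination of history
        grad = A.T @ (A @ x_bar - y)                        # data-consistency gradient step
        z_next = soft_threshold(x_bar - alpha * grad, tau)  # regularization unit
        history.append(z_next)
    return history[-1]
```

Setting `weights` so that only the most recent output has nonzero weight recovers the conventional PGD unrolling; the dense combination is what makes the scheme history-cognizant.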


Similar Articles

Equilibrated Zeroth-Order Unrolled Deep Network for Parallel MR Imaging.
IEEE Trans Med Imaging. 2023 Dec;42(12):3540-3554. doi: 10.1109/TMI.2023.3293826. Epub 2023 Nov 30.

Cited By

Learning Task-Specific Strategies for Accelerated MRI.
IEEE Trans Comput Imaging. 2024;10:1040-1054. doi: 10.1109/tci.2024.3410521. Epub 2024 Jul 1.

Interpretable deep learning for deconvolutional analysis of neural signals.
Neuron. 2025 Apr 16;113(8):1151-1168.e13. doi: 10.1016/j.neuron.2025.02.006. Epub 2025 Mar 12.

References

Accelerating Magnetic Resonance Imaging via Deep Learning.
Proc IEEE Int Symp Biomed Imaging. 2016 Apr;2016:514-517. doi: 10.1109/ISBI.2016.7493320. Epub 2016 Jun 16.

k-Space Deep Learning for Accelerated MRI.
IEEE Trans Med Imaging. 2020 Feb;39(2):377-386. doi: 10.1109/TMI.2019.2927101. Epub 2019 Jul 5.

ADMM-CSNet: A Deep Learning Approach for Image Compressive Sensing.
IEEE Trans Pattern Anal Mach Intell. 2020 Mar;42(3):521-538. doi: 10.1109/TPAMI.2018.2883941. Epub 2018 Nov 28.
