Suppr 超能文献



Linear fine-tuning: a linear transformation based transfer strategy for deep MRI reconstruction.

Author information

Bi Wanqing, Xv Jianan, Song Mengdie, Hao Xiaohan, Gao Dayong, Qi Fulang

Affiliations

The Centers for Biomedical Engineering, University of Science and Technology of China, Hefei, Anhui, China.

Fuqing Medical Co., Ltd., Hefei, Anhui, China.

Publication information

Front Neurosci. 2023 Jun 20;17:1202143. doi: 10.3389/fnins.2023.1202143. eCollection 2023.

DOI: 10.3389/fnins.2023.1202143
PMID: 37409107
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10318193/
Abstract

INTRODUCTION

Fine-tuning (FT) is a generally adopted transfer learning method for deep learning-based magnetic resonance imaging (MRI) reconstruction. In this approach, the reconstruction model is initialized with pre-trained weights derived from a source domain with ample data and subsequently updated with limited data from the target domain. However, the direct full-weight update strategy can pose the risk of "catastrophic forgetting" and overfitting, hindering its effectiveness. The goal of this study is to develop a zero-weight update transfer strategy to preserve pre-trained generic knowledge and reduce overfitting.

METHODS

Based on the commonality between the source and target domains, we assume a linear transformation relationship of the optimal model weights from the source domain to the target domain. Accordingly, we propose a novel transfer strategy, linear fine-tuning (LFT), which introduces scaling and shifting (SS) factors into the pre-trained model. In contrast to FT, LFT only updates SS factors in the transfer phase, while the pre-trained weights remain fixed.
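The scaling-and-shifting idea above can be illustrated with a deliberately tiny sketch. This is a hypothetical one-weight model, not the paper's reconstruction network: the pretrained weight w stays frozen, only the scale s and shift b are trained, and the effective weight is w_eff = s·w + b.

```python
# Minimal LFT sketch: pretrained weight w is frozen; gradient descent
# updates only the scaling factor s and shifting factor b, so the model
# used at inference has effective weight w_eff = s * w + b.

def lft_step(w, s, b, x, y, lr=0.01):
    """One SGD step on (s, b) for the loss 0.5 * (w_eff * x - y)^2."""
    w_eff = s * w + b
    err = w_eff * x - y        # d(loss)/d(prediction)
    grad_s = err * x * w       # chain rule through w_eff = s * w + b
    grad_b = err * x
    return s - lr * grad_s, b - lr * grad_b

# "Source domain" gave us the pretrained, frozen weight w = 2.0.
# The "target domain" behaves like y = 3.0 * x, so the best w_eff is 3.0.
w, s, b = 2.0, 1.0, 0.0        # s=1, b=0 starts at the pretrained model
data = [(0.1 * i, 3.0 * 0.1 * i) for i in range(1, 11)]
for _ in range(500):
    for x, y in data:
        s, b = lft_step(w, s, b, x, y)

w_eff = s * w + b
print(round(w_eff, 3))         # → 3.0  (while w itself is still 2.0)
```

Because only (s, b) move, the pretrained knowledge encoded in w is preserved exactly; in the paper's setting the same principle is applied per weight tensor of a deep reconstruction network rather than to a single scalar.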

RESULTS

To evaluate the proposed LFT, we designed three different transfer scenarios and conducted a comparative analysis of FT, LFT, and other methods at various sampling rates and data volumes. In the transfer scenario between different contrasts, LFT outperforms typical transfer strategies at various sampling rates and considerably reduces artifacts on reconstructed images. In transfer scenarios between different slice directions or anatomical structures, LFT surpasses the FT method, particularly as the number of training images in the target domain decreases, with an improvement of up to 2.06 dB (5.89%) in peak signal-to-noise ratio.
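Peak signal-to-noise ratio, the metric behind the 2.06 dB figure above, can be computed from the mean squared error between a reference image and a reconstruction. The snippet below uses the standard definition on a toy pair of images; it is a generic illustration, not the paper's evaluation code.

```python
import math

def psnr(reference, reconstruction, peak=1.0):
    """PSNR in dB: 10 * log10(peak^2 / MSE) over equally sized images."""
    n = len(reference)
    mse = sum((r - x) ** 2 for r, x in zip(reference, reconstruction)) / n
    if mse == 0:
        return float("inf")    # identical images: noise-free reconstruction
    return 10 * math.log10(peak ** 2 / mse)

# Toy 4-pixel "images" with intensities normalized to [0, 1]:
ref = [0.0, 0.5, 0.75, 1.0]
rec = [0.1, 0.5, 0.70, 0.9]
print(round(psnr(ref, rec), 2))   # → 22.5
```

Because PSNR is logarithmic, a fixed dB gain corresponds to a fixed multiplicative reduction in MSE: the reported 2.06 dB improvement means the MSE shrank by a factor of about 10^0.206 ≈ 1.6.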

DISCUSSION

The LFT strategy shows great potential to address the issues of "catastrophic forgetting" and overfitting in transfer scenarios for MRI reconstruction, while reducing the reliance on the amount of data in the target domain. Linear fine-tuning is expected to shorten the development cycle of reconstruction models for adapting to complicated clinical scenarios, thereby enhancing the clinical applicability of deep MRI reconstruction.


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/c47c6c3a3152/fnins-17-1202143-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/fd0b39cf2438/fnins-17-1202143-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/257aa8e7808e/fnins-17-1202143-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/2d652e148b56/fnins-17-1202143-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/b727fc7b326c/fnins-17-1202143-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/9cb22353a560/fnins-17-1202143-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/6f141a56788c/fnins-17-1202143-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02bf/10318193/7cc5bb40c0fa/fnins-17-1202143-g0008.jpg

Similar articles

1
Linear fine-tuning: a linear transformation based transfer strategy for deep MRI reconstruction.
Front Neurosci. 2023 Jun 20;17:1202143. doi: 10.3389/fnins.2023.1202143. eCollection 2023.
2
Autoencoder and restricted Boltzmann machine for transfer learning in functional magnetic resonance imaging task classification.
Heliyon. 2023 Jul 16;9(7):e18086. doi: 10.1016/j.heliyon.2023.e18086. eCollection 2023 Jul.
3
Transfer learning in deep neural network based under-sampled MR image reconstruction.
Magn Reson Imaging. 2021 Feb;76:96-107. doi: 10.1016/j.mri.2020.09.018. Epub 2020 Sep 24.
4
Enhancing Domain Diversity of Transfer Learning-Based SSVEP-BCIs by the Reconstruction of Channel Correlation.
IEEE Trans Biomed Eng. 2025 Feb;72(2):503-514. doi: 10.1109/TBME.2024.3458389. Epub 2025 Jan 21.
5
Brain tumor classification for MR images using transfer learning and fine-tuning.
Comput Med Imaging Graph. 2019 Jul;75:34-46. doi: 10.1016/j.compmedimag.2019.05.001. Epub 2019 May 18.
6
A Transfer-Learning Approach for Accelerated MRI Using Deep Neural Networks.
Magn Reson Med. 2020 Aug;84(2):663-685. doi: 10.1002/mrm.28148. Epub 2020 Jan 3.
7
MRI super-resolution reconstruction for MRI-guided adaptive radiotherapy using cascaded deep learning: In the presence of limited training data and unknown translation model.
Med Phys. 2019 Sep;46(9):4148-4164. doi: 10.1002/mp.13717. Epub 2019 Aug 7.
8
Adaptive multi-source domain collaborative fine-tuning for transfer learning.
PeerJ Comput Sci. 2024 Jun 21;10:e2107. doi: 10.7717/peerj-cs.2107. eCollection 2024.
9
Transfer-learning is a key ingredient to fast deep learning-based 4D liver MRI reconstruction.
Sci Rep. 2023 Jul 11;13(1):11227. doi: 10.1038/s41598-023-38073-1.
10
Unsupervised Domain Adaptation for Image Classification and Object Detection Using Guided Transfer Learning Approach and JS Divergence.
Sensors (Basel). 2023 Apr 30;23(9):4436. doi: 10.3390/s23094436.

References cited in this article

1
C-GAN: Content-consistent generative adversarial networks for unsupervised domain adaptation in medical image segmentation.
Med Phys. 2022 Oct;49(10):6491-6504. doi: 10.1002/mp.15944. Epub 2022 Aug 27.
2
Transfer learning enhanced generative adversarial networks for multi-channel MRI reconstruction.
Comput Biol Med. 2021 Jul;134:104504. doi: 10.1016/j.compbiomed.2021.104504. Epub 2021 May 26.
3
Which GAN? A comparative study of generative adversarial network-based fast MRI reconstruction.
Philos Trans A Math Phys Eng Sci. 2021 Jun 28;379(2200):20200203. doi: 10.1098/rsta.2020.0203. Epub 2021 May 10.
4
Novel Transfer Learning Approach for Medical Imaging with Limited Labeled Data.
Cancers (Basel). 2021 Mar 30;13(7):1590. doi: 10.3390/cancers13071590.
5
Analysis of deep complex-valued convolutional neural networks for MRI reconstruction and phase-focused applications.
Magn Reson Med. 2021 Aug;86(2):1093-1109. doi: 10.1002/mrm.28733. Epub 2021 Mar 16.
6
Targeted transfer learning to improve performance in small medical physics datasets.
Med Phys. 2020 Dec;47(12):6246-6256. doi: 10.1002/mp.14507. Epub 2020 Oct 25.
7
Transfer learning in deep neural network based under-sampled MR image reconstruction.
Magn Reson Imaging. 2021 Feb;76:96-107. doi: 10.1016/j.mri.2020.09.018. Epub 2020 Sep 24.
8
Fine-Tuning U-Net for Ultrasound Image Segmentation: Different Layers, Different Outcomes.
IEEE Trans Ultrason Ferroelectr Freq Control. 2020 Dec;67(12):2510-2518. doi: 10.1109/TUFFC.2020.3015081. Epub 2020 Nov 24.
9
Subsampled brain MRI reconstruction by generative adversarial neural networks.
Med Image Anal. 2020 Oct;65:101747. doi: 10.1016/j.media.2020.101747. Epub 2020 Jun 11.
10
On instabilities of deep learning in image reconstruction and the potential costs of AI.
Proc Natl Acad Sci U S A. 2020 Dec 1;117(48):30088-30095. doi: 10.1073/pnas.1907377117. Epub 2020 May 11.