

Automated fibroglandular tissue segmentation in breast MRI using generative adversarial networks.

Affiliations

School of Data and Computer Science, Sun Yat-Sen University, Guangzhou, People's Republic of China.
Guangdong Province Key Laboratory of Computational Science, Sun Yat-Sen University, Guangzhou, People's Republic of China.

Publication Information

Phys Med Biol. 2020 May 19;65(10):105006. doi: 10.1088/1361-6560/ab7e7f.


DOI: 10.1088/1361-6560/ab7e7f
PMID: 32155611
Abstract

Fibroglandular tissue (FGT) segmentation is a crucial step for quantitative analysis of background parenchymal enhancement (BPE) in magnetic resonance imaging (MRI), which is useful for breast cancer risk assessment. In this study, we develop an automated deep learning method based on a generative adversarial network (GAN) to identify the FGT region in MRI volumes and evaluate its impact on a specific clinical application. The GAN consists of an improved U-Net as a generator to generate FGT candidate areas and a patch deep convolutional neural network (DCNN) as a discriminator to evaluate the authenticity of the synthetic FGT region. The proposed method has two improvements compared to the classical U-Net: (1) the improved U-Net is designed to extract more features of the FGT region for a more accurate description of the FGT region; (2) a patch DCNN is designed for discriminating the authenticity of the FGT region generated by the improved U-Net, which makes the segmentation result more stable and accurate. A dataset of 100 three-dimensional (3D) bilateral breast MRI scans from 100 patients (aged 22-78 years) was used in this study with Institutional Review Board (IRB) approval. 3D hand-segmented FGT areas for all breasts were provided as a reference standard. Five-fold cross-validation was used in training and testing of the models. The Dice similarity coefficient (DSC) and Jaccard index (JI) values were evaluated to measure the segmentation accuracy. The previous method using classical U-Net was used as a baseline in this study. In the five partitions of the cross-validation set, the GAN achieved DSC and JI values of 87.0 ± 7.0% and 77.6 ± 10.1%, respectively, while the corresponding values obtained by the baseline method were 81.1 ± 8.7% and 69.0 ± 11.3%, respectively. The proposed method is significantly superior to the previous method using U-Net. The FGT segmentation impacted the BPE quantification application in the following manner: the correlation coefficients between the quantified BPE value and BI-RADS BPE categories provided by the radiologist were 0.46 ± 0.15 (best: 0.63) based on GAN segmented FGT areas, while the corresponding correlation coefficients were 0.41 ± 0.16 (best: 0.60) based on baseline U-Net segmented FGT areas. BPE can be quantified better using the FGT areas segmented by the proposed GAN model than using the FGT areas segmented by the baseline U-Net.
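
The abstract reports segmentation accuracy as Dice similarity coefficient (DSC) and Jaccard index (JI) values. As a minimal sketch (not code from the paper), the snippet below shows how these two overlap metrics are typically computed for binary 3D masks with NumPy; the mask shape and the toy perturbation are invented for illustration. The final print statement also checks the identity JI = DSC / (2 - DSC), which relates the two reported metrics. In the paper's setting, pred would correspond to the FGT mask produced by the GAN generator and ref to the hand-segmented reference, with the metrics averaged over the five cross-validation folds.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, ref: np.ndarray) -> float:
    """Dice similarity coefficient (DSC) between two binary masks of any shape."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    denom = pred.sum() + ref.sum()
    return 2.0 * intersection / denom if denom > 0 else 1.0

def jaccard_index(pred: np.ndarray, ref: np.ndarray) -> float:
    """Jaccard index (JI, intersection over union) between two binary masks."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    intersection = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()
    return intersection / union if union > 0 else 1.0

# Toy 3D example (synthetic data, not the paper's dataset):
# a hypothetical reference FGT mask and a prediction with ~5% of voxels flipped.
rng = np.random.default_rng(0)
ref_mask = rng.random((16, 64, 64)) > 0.7
pred_mask = ref_mask.copy()
flip = rng.random(ref_mask.shape) > 0.95
pred_mask[flip] = ~pred_mask[flip]

dsc = dice_coefficient(pred_mask, ref_mask)
ji = jaccard_index(pred_mask, ref_mask)
print(f"DSC = {dsc:.3f}, JI = {ji:.3f}, DSC/(2 - DSC) = {dsc / (2 - dsc):.3f}")
```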


Similar Articles

[1]
Automated fibroglandular tissue segmentation in breast MRI using generative adversarial networks.

Phys Med Biol. 2020-5-19

[2]
Using deep learning to segment breast and fibroglandular tissue in MRI volumes.

Med Phys. 2017-2

[3]
Fully Automated Convolutional Neural Network Method for Quantification of Breast MRI Fibroglandular Tissue and Background Parenchymal Enhancement.

J Digit Imaging. 2019-2

[4]
Automatic Breast and Fibroglandular Tissue Segmentation in Breast MRI Using Deep Learning by a Fully-Convolutional Residual Neural Network U-Net.

Acad Radiol. 2019-1-31

[5]
Fully automatic quantification of fibroglandular tissue and background parenchymal enhancement with accurate implementation for axial and sagittal breast MRI protocols.

Med Phys. 2021-1

[6]
Fully Automatic Assessment of Background Parenchymal Enhancement on Breast MRI Using Machine-Learning Models.

J Magn Reson Imaging. 2021-3

[7]
Automated fibroglandular tissue segmentation and volumetric density estimation in breast MRI using an atlas-aided fuzzy C-means method.

Med Phys. 2013-12

[8]
Generalizable attention U-Net for segmentation of fibroglandular tissue and background parenchymal enhancement in breast DCE-MRI.

Insights Imaging. 2023-11-6

[9]
Development of U-Net Breast Density Segmentation Method for Fat-Sat MR Images Using Transfer Learning Based on Non-Fat-Sat Model.

J Digit Imaging. 2021-8

[10]
Amount of fibroglandular tissue FGT and background parenchymal enhancement BPE in relation to breast cancer risk and false positives in a breast MRI screening program : A retrospective cohort study.

Eur Radiol. 2019-2-22

Cited By

[1]
Comparative analysis of nnU-Net and Auto3Dseg for fat and fibroglandular tissue segmentation in MRI.

J Med Imaging (Bellingham). 2025-3

[2]
Chan-Vese aided fuzzy C-means approach for whole breast and fibroglandular tissue segmentation: Preliminary application to real-world breast MRI.

Med Phys. 2025-5

[3]
New Frontiers in Breast Cancer Imaging: The Rise of AI.

Bioengineering (Basel). 2024-5-2

[4]
A publicly available deep learning model and dataset for segmentation of breast, fibroglandular tissue, and vessels in breast MRI.

Sci Rep. 2024-3-5

[5]
Generalizable attention U-Net for segmentation of fibroglandular tissue and background parenchymal enhancement in breast DCE-MRI.

Insights Imaging. 2023-11-6

[6]
Fibroglandular tissue segmentation in breast MRI using vision transformers: a multi-institutional evaluation.

Sci Rep. 2023-8-30

[7]
Improvement of semantic segmentation through transfer learning of multi-class regions with convolutional neural networks on supine and prone breast MRI images.

Sci Rep. 2023-4-27

[8]
Artificial intelligence in breast cancer imaging: risk stratification, lesion detection and classification, treatment planning and prognosis-a narrative review.

Explor Target Antitumor Ther. 2022

[9]
AI in Breast Cancer Imaging: A Survey of Different Applications.

J Imaging. 2022-8-26

[10]
Generative adversarial networks and its applications in the biomedical image segmentation: a comprehensive survey.

Int J Multimed Inf Retr. 2022
