
2S-BUSGAN: A Novel Generative Adversarial Network for Realistic Breast Ultrasound Image with Corresponding Tumor Contour Based on Small Datasets.

Affiliations

College of Biomedical Engineering, Sichuan University, Chengdu 610065, China.

Department of Ultrasound, West China Hospital, Sichuan University, Chengdu 610065, China.

Publication

Sensors (Basel). 2023 Oct 20;23(20):8614. doi: 10.3390/s23208614.

DOI: 10.3390/s23208614
PMID: 37896706
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10610581/
Abstract

Deep learning (DL) models in breast ultrasound (BUS) image analysis face challenges with data imbalance and limited atypical tumor samples. Generative Adversarial Networks (GANs) address these challenges by providing efficient data augmentation for small datasets. However, current GAN approaches fail to capture the structural features of BUS images, so the generated images lack structural legitimacy and are unrealistic. Furthermore, generated images require manual annotation for different downstream tasks before they can be used. We therefore propose a two-stage GAN framework, 2s-BUSGAN, for generating annotated BUS images. It consists of a Mask Generation Stage (MGS) and an Image Generation Stage (IGS), generating benign and malignant BUS images with corresponding tumor contours. Moreover, we employ a Feature-Matching Loss (FML) to enhance the quality of generated images and utilize a Differential Augmentation Module (DAM) to improve GAN performance on small datasets. We conduct experiments on two datasets, BUSI and Collected. Results indicate that the quality of the generated images is improved compared with traditional GAN methods. Our generated images were also evaluated by ultrasound experts, demonstrating that they could plausibly deceive doctors. A comparative evaluation showed that our method also outperforms traditional GAN methods when applied to training segmentation and classification models. Our method achieved classification accuracies of 69% and 85.7% on the two datasets, respectively, about 3% and 2% higher than the traditional augmentation model. The segmentation model trained on 2s-BUSGAN-augmented datasets achieved Dice scores of 75% and 73% on the two datasets, respectively, higher than with traditional augmentation methods. Our research tackles the challenges of imbalanced and limited BUS image data, and our 2s-BUSGAN augmentation method holds potential for enhancing deep learning model performance in the field.
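The Feature-Matching Loss (FML) named in the abstract is a standard GAN stabilization technique: instead of scoring generated images only by the discriminator's final output, the generator is penalized for mismatching the batch-mean intermediate discriminator features of real images. The sketch below is illustrative only and not the authors' code; the function name, layer list, and feature shapes are assumptions.

```python
import numpy as np

def feature_matching_loss(real_feats, fake_feats):
    """L1 distance between batch-mean discriminator features.

    real_feats, fake_feats: lists of arrays, one per discriminator layer,
    each of shape (batch, channels). The batch mean is taken per layer,
    and the absolute differences are summed over layers and channels.
    """
    loss = 0.0
    for fr, ff in zip(real_feats, fake_feats):
        # match first-order feature statistics, not individual samples
        loss += np.abs(fr.mean(axis=0) - ff.mean(axis=0)).sum()
    return loss

# Identical feature statistics yield zero loss.
feats = [np.ones((4, 8)), np.zeros((4, 16))]
assert feature_matching_loss(feats, feats) == 0.0
```

In training, `real_feats` and `fake_feats` would come from hooks on the discriminator's hidden layers, and this term would be added to the generator's adversarial loss.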


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/473d62eb858b/sensors-23-08614-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/ca8722a880a3/sensors-23-08614-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/4605aebb49f8/sensors-23-08614-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/cd89b5f399ab/sensors-23-08614-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/c7f9986a59f1/sensors-23-08614-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/a17a2021dee7/sensors-23-08614-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/d5bad1a2a99f/sensors-23-08614-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/3ce648637e42/sensors-23-08614-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/0dc1a448b099/sensors-23-08614-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/2fa60d008216/sensors-23-08614-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bbce/10610581/7c3605c83a2e/sensors-23-08614-g011.jpg

Similar Articles

1
2S-BUSGAN: A Novel Generative Adversarial Network for Realistic Breast Ultrasound Image with Corresponding Tumor Contour Based on Small Datasets.
Sensors (Basel). 2023 Oct 20;23(20):8614. doi: 10.3390/s23208614.
2
Semi-supervised segmentation of lesion from breast ultrasound images with attentional generative adversarial network.
Comput Methods Programs Biomed. 2020 Jun;189:105275. doi: 10.1016/j.cmpb.2019.105275. Epub 2019 Dec 12.
3
A medical image classification method based on self-regularized adversarial learning.
Med Phys. 2024 Nov;51(11):8232-8246. doi: 10.1002/mp.17320. Epub 2024 Jul 30.
4
MSF-GAN: Multi-Scale Fuzzy Generative Adversarial Network for Breast Ultrasound Image Segmentation.
Annu Int Conf IEEE Eng Med Biol Soc. 2021 Nov;2021:3193-3196. doi: 10.1109/EMBC46164.2021.9630108.
5
Cross-domain attention-guided generative data augmentation for medical image analysis with limited data.
Comput Biol Med. 2024 Jan;168:107744. doi: 10.1016/j.compbiomed.2023.107744. Epub 2023 Nov 23.
6
Semi-supervised GAN-based Radiomics Model for Data Augmentation in Breast Ultrasound Mass Classification.
Comput Methods Programs Biomed. 2021 May;203:106018. doi: 10.1016/j.cmpb.2021.106018. Epub 2021 Feb 27.
7
SpeckleGAN: a generative adversarial network with an adaptive speckle layer to augment limited training data for ultrasound image processing.
Int J Comput Assist Radiol Surg. 2020 Sep;15(9):1427-1436. doi: 10.1007/s11548-020-02203-1. Epub 2020 Jun 18.
8
GSDA: Generative adversarial network-based semi-supervised data augmentation for ultrasound image classification.
Heliyon. 2023 Sep 4;9(9):e19585. doi: 10.1016/j.heliyon.2023.e19585. eCollection 2023 Sep.
9
A cGAN-based tumor segmentation method for breast ultrasound images.
Phys Med Biol. 2023 Jun 21;68(13). doi: 10.1088/1361-6560/acdbb4.
10
High resolution histopathology image generation and segmentation through adversarial training.
Med Image Anal. 2022 Jan;75:102251. doi: 10.1016/j.media.2021.102251. Epub 2021 Nov 3.

Cited By

1
Enhancing Lesion Segmentation in Ultrasound Images: The Impact of Targeted Data Augmentation Strategies.
Int J Biomed Imaging. 2025 Aug 11;2025:3309822. doi: 10.1155/ijbi/3309822. eCollection 2025.
2
Automatic joint segmentation and classification of breast ultrasound images via multi-task learning with object contextual attention.
Front Oncol. 2025 Apr 8;15:1567577. doi: 10.3389/fonc.2025.1567577. eCollection 2025.

References

1
Cancer statistics in China and United States, 2022: profiles, trends, and determinants.
Chin Med J (Engl). 2022 Feb 9;135(5):584-590. doi: 10.1097/CM9.0000000000002108.
2
A Progressive Generative Adversarial Method for Structurally Inadequate Medical Image Data Augmentation.
IEEE J Biomed Health Inform. 2022 Jan;26(1):7-16. doi: 10.1109/JBHI.2021.3101551. Epub 2022 Jan 17.
3
Semi-supervised GAN-based Radiomics Model for Data Augmentation in Breast Ultrasound Mass Classification.
Comput Methods Programs Biomed. 2021 May;203:106018. doi: 10.1016/j.cmpb.2021.106018. Epub 2021 Feb 27.
4
Global Cancer Statistics 2020: GLOBOCAN Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries.
CA Cancer J Clin. 2021 May;71(3):209-249. doi: 10.3322/caac.21660. Epub 2021 Feb 4.
5
Breaking medical data sharing boundaries by using synthesized radiographs.
Sci Adv. 2020 Dec 2;6(49). doi: 10.1126/sciadv.abb7973. Print 2020 Dec.
6
Breast ultrasound region of interest detection and lesion localisation.
Artif Intell Med. 2020 Jul;107:101880. doi: 10.1016/j.artmed.2020.101880. Epub 2020 May 29.
7
Semi-supervised segmentation of lesion from breast ultrasound images with attentional generative adversarial network.
Comput Methods Programs Biomed. 2020 Jun;189:105275. doi: 10.1016/j.cmpb.2019.105275. Epub 2019 Dec 12.
8
Dataset of breast ultrasound images.
Data Brief. 2019 Nov 21;28:104863. doi: 10.1016/j.dib.2019.104863. eCollection 2020 Feb.