
Automatic segmentation model and machine learning model grounded in ultrasound radiomics for distinguishing between low malignant risk and intermediate-high malignant risk of adnexal masses.

Authors

Liu Lu, Cai Wenjun, Zheng Feibo, Tian Hongyan, Li Yanping, Wang Ting, Chen Xiaonan, Zhu Wenjing

Affiliations

Department of Ultrasound Medicine, South China Hospital, Medical School, Shenzhen University, Shenzhen, P. R. China.

Department of Ultrasound, Shenzhen University General Hospital, Medical School, Shenzhen University, Shenzhen, P. R. China.

Publication

Insights Imaging. 2025 Jan 13;16(1):14. doi: 10.1186/s13244-024-01874-7.

DOI:10.1186/s13244-024-01874-7
PMID:39804536
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11729609/
Abstract

OBJECTIVE

To develop an automatic segmentation model to delineate the adnexal masses and construct a machine learning model to differentiate between low malignant risk and intermediate-high malignant risk of adnexal masses based on ovarian-adnexal reporting and data system (O-RADS).

METHODS

A total of 663 ultrasound images of adnexal masses were collected and divided into two sets by experienced radiologists: a low malignant risk set (n = 446) and an intermediate-high malignant risk set (n = 217). Deep learning segmentation models were trained and compared to automatically segment the adnexal masses. Radiomics features were extracted with Pyradiomics. Feature selection was conducted using Spearman correlation analysis, the Mann-Whitney U-test, and least absolute shrinkage and selection operator (LASSO) regression. A nomogram integrating radiomic and clinical features via a machine learning model was established and evaluated. SHapley Additive exPlanations (SHAP) was used for model interpretability and visualization.
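The feature-selection cascade described above (a correlation filter, a univariate test, then LASSO) can be sketched with scipy and scikit-learn. The synthetic data, correlation cutoff, significance level, and LASSO alpha below are illustrative assumptions, not the study's actual settings:

```python
import numpy as np
from scipy.stats import spearmanr, mannwhitneyu
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n, p = 120, 20
X = rng.normal(size=(n, p))                    # stand-in for radiomics features
y = (X[:, 0] + 0.5 * X[:, 1]                   # classes driven by features 0 and 1
     + rng.normal(scale=0.5, size=n) > 0).astype(int)

# 1) Spearman filter: drop the later feature of any highly correlated pair.
rho = np.abs(spearmanr(X)[0])
keep = list(range(p))
for i in range(p):
    for j in range(i + 1, p):
        if rho[i, j] > 0.9 and j in keep:
            keep.remove(j)

# 2) Mann-Whitney U-test: keep features that differ between the risk groups.
keep = [i for i in keep
        if mannwhitneyu(X[y == 0, i], X[y == 1, i]).pvalue < 0.05]

# 3) LASSO: the L1 penalty shrinks uninformative coefficients to exactly zero.
Xs = StandardScaler().fit_transform(X[:, keep])
coef = Lasso(alpha=0.05).fit(Xs, y).coef_
selected = [f for f, c in zip(keep, coef) if c != 0]
print(selected)
```

Each stage only narrows the candidate set, so the final `selected` list is the subset of features that survives all three criteria.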

RESULTS

The FCN ResNet101 demonstrated the highest segmentation performance for adnexal masses (Dice similarity coefficient: 89.1%). Among the classifiers, the support vector machine achieved the best AUC (0.961, 95% CI: 0.925-0.996), while the nomogram built with the LightGBM algorithm reached the best overall AUC (0.966, 95% CI: 0.927-1.000). The diagnostic performance of the nomogram was comparable to that of experienced radiologists (p > 0.05) and superior to that of less-experienced radiologists (p < 0.05). The model significantly improved the diagnostic accuracy of less-experienced radiologists.
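The Dice similarity coefficient used to score the segmentation model is twice the overlap between the predicted and reference masks divided by their summed areas. A minimal NumPy version on toy binary masks (illustrative, not the authors' evaluation code):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for binary segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    denom = pred.sum() + truth.sum()
    if denom == 0:           # both masks empty: treat as perfect agreement
        return 1.0
    return 2.0 * np.logical_and(pred, truth).sum() / denom

# Toy 4x4 masks: 3 overlapping pixels, 4 predicted and 4 true in total.
pred  = np.array([[1, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
truth = np.array([[0, 1, 0, 0],
                  [1, 1, 0, 0],
                  [0, 1, 0, 0],
                  [0, 0, 0, 0]])
print(dice_coefficient(pred, truth))  # 2*3 / (4+4) = 0.75
```

A Dice of 89.1% therefore means the automatic contours overlapped the radiologists' reference contours almost completely relative to their combined area.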

CONCLUSIONS

The segmentation model serves as a valuable tool for the automated delineation of adnexal lesions. The machine learning model exhibited commendable classification capability and outperformed the diagnostic performance of less-experienced radiologists.

CRITICAL RELEVANCE STATEMENT

The ultrasound radiomics-based machine learning model holds the potential to elevate the professional ability of less-experienced radiologists and can be used to assist in the clinical screening of ovarian cancer.

KEY POINTS

• We developed an image segmentation model to automatically delineate adnexal masses.
• We developed a model to classify adnexal masses based on O-RADS.
• The machine learning model achieved commendable classification performance.
• The machine learning model can enhance the proficiency of less-experienced radiologists.
• We used SHapley Additive exPlanations (SHAP) to interpret and visualize the model.
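The SHAP attributions mentioned above rest on Shapley values from cooperative game theory: each feature's value is its average marginal contribution to the prediction over all feature orderings. The paper uses the shap library; the underlying quantity can be computed exactly for a toy model by direct enumeration (an illustrative sketch, not the authors' code, and tractable only for a handful of features):

```python
import itertools
import math
import numpy as np

def shapley_values(predict, x, baseline):
    """Exact Shapley values: average each feature's marginal contribution
    over all feature orderings. Features not yet 'revealed' are held at
    the baseline value; shap approximates this for real models."""
    n = len(x)
    phi = np.zeros(n)
    for order in itertools.permutations(range(n)):
        z = baseline.astype(float).copy()
        prev = predict(z)
        for i in order:
            z[i] = x[i]           # reveal feature i
            cur = predict(z)
            phi[i] += cur - prev  # marginal contribution in this ordering
            prev = cur
    return phi / math.factorial(n)

# Toy linear model: Shapley values are known to equal w * (x - baseline).
w = np.array([2.0, -1.0, 0.5])
predict = lambda z: float(w @ z)
x = np.array([1.0, 2.0, 3.0])
baseline = np.zeros(3)
phi = shapley_values(predict, x, baseline)
print(phi)  # w * (x - baseline) = [2, -2, 1.5]
```

By construction the attributions sum to `predict(x) - predict(baseline)`, which is what makes SHAP plots a faithful per-feature decomposition of a single prediction.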


Figures (PMC11729609):
Fig 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d1e/11729609/b8db9b5356d3/13244_2024_1874_Fig1_HTML.jpg
Fig 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d1e/11729609/adf9c66c1c60/13244_2024_1874_Fig2_HTML.jpg
Fig 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d1e/11729609/cc99a3e080ab/13244_2024_1874_Fig3_HTML.jpg
Fig 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d1e/11729609/9a60ea918f06/13244_2024_1874_Fig4_HTML.jpg
Fig 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d1e/11729609/fefe27322752/13244_2024_1874_Fig5_HTML.jpg
Fig 6: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6d1e/11729609/e48f466f9116/13244_2024_1874_Fig6_HTML.jpg
