A Study on Automatic O-RADS Classification of Sonograms of Ovarian Adnexal Lesions Based on Deep Convolutional Neural Networks.

Authors

Liu Tao, Miao Kuo, Tan Gaoqiang, Bu Hanqi, Shao Xiaohui, Wang Siming, Dong Xiaoqiu

Affiliations

The Department of Ultrasound Medicine, Harbin Medical University Fourth Affiliated Hospital, Harbin, Heilongjiang, China.


Publication

Ultrasound Med Biol. 2025 Feb;51(2):387-395. doi: 10.1016/j.ultrasmedbio.2024.11.009. Epub 2024 Nov 26.

DOI: 10.1016/j.ultrasmedbio.2024.11.009
PMID: 39603844
Abstract

OBJECTIVE

This study explored a new method for automatic O-RADS classification of sonograms based on a deep convolutional neural network (DCNN).

METHODS

A development dataset (DD) of 2,455 two-dimensional grayscale sonograms of 870 ovarian adnexal lesions and an intertemporal validation dataset (IVD) of 426 sonograms of 280 lesions were collected and classified according to O-RADS v2022 (categories 2-5) by three senior sonographers. Classification results whose malignancy rates a two-tailed z-test confirmed to be consistent with O-RADS v2022 (indicating diagnostic performance comparable to a previous study) were used for training; otherwise, the classification was repeated by two different sonographers. The DD was used to develop three DCNN models (ResNet34, DenseNet121, and ConvNeXt-Tiny) via transfer learning. Model performance was assessed with accuracy, precision, F1 score, and related metrics. The optimal model was selected, validated over time on the IVD, and used to analyze whether O-RADS classification efficiency improved for three sonographers with different years of experience when assisted by the model.

RESULTS

The proportion of malignant tumors in the DD and IVD in each O-RADS-defined risk category was verified using a two-tailed z-test. Malignant lesions (O-RADS categories 4 and 5) were diagnosed in the DD and IVD with sensitivities of 0.949 and 0.962 and specificities of 0.892 and 0.842, respectively. ResNet34, DenseNet121, and ConvNeXt-Tiny had overall accuracies of 0.737, 0.752, and 0.878, respectively, for sonogram prediction in the DD. The ConvNeXt-Tiny model's accuracy for sonogram prediction in the IVD was 0.859, with no significant difference between test sets. Model assistance significantly reduced O-RADS classification time for all three sonographers (Cohen's d = 5.75).
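The statistics reported above (one-sample two-tailed z-test for proportions, sensitivity/specificity, pooled-SD Cohen's d) can be sketched with stdlib Python. All input numbers below are illustrative placeholders, not the study's raw data:

```python
import math
from statistics import mean, stdev


def two_tailed_z_test(k: int, n: int, p0: float) -> tuple[float, float]:
    """One-sample two-tailed z-test: does an observed malignancy
    proportion k/n match a reference rate p0 (e.g. the O-RADS v2022
    risk-category rate)?"""
    p_hat = k / n
    z = (p_hat - p0) / math.sqrt(p0 * (1 - p0) / n)
    # Two-tailed p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value


def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    # Sensitivity: malignant lesions (O-RADS 4-5) correctly flagged;
    # specificity: benign lesions correctly cleared.
    return tp / (tp + fn), tn / (tn + fp)


def cohens_d(a: list[float], b: list[float]) -> float:
    """Pooled-SD Cohen's d, e.g. for the drop in per-case
    classification time with model assistance."""
    na, nb = len(a), len(b)
    pooled = math.sqrt(((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2))
    return (mean(a) - mean(b)) / pooled


# Illustrative values only:
z, p = two_tailed_z_test(12, 200, 0.05)                # observed 6% vs. reference 5%
sens, spec = sensitivity_specificity(150, 8, 116, 14)  # hypothetical confusion counts
d = cohens_d([60.0, 62.0, 58.0], [30.0, 31.0, 29.0])   # seconds per case, before/after
```

A non-significant z-test (p > 0.05) is what qualified a reference-standard classification for training in the methods above; a large positive d corresponds to the reported time saving.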

CONCLUSION

ConvNeXt-Tiny showed robust and stable performance in classifying O-RADS categories 2-5, improving sonographers' classification efficiency.


