Suppr 超能文献



Deep learning model using CT images for longitudinal prediction of benign and malignant ground-glass nodules.

Authors

Yang Xiaolong, Wang Jiayang, Wang Ping, Li Yingjie, Wen Zhubin, Shang Jiming, Chen Kaige, Tang Chao, Liang Shuang, Meng Wei

Affiliations

Department of Radiology, Harbin Medical University, Harbin Medical University Cancer Hospital, 150 Haping Road, Harbin, Heilongjiang 150081, China.

Department of Radiology, Beijing Jishuitan Hospital, Capital Medical University, Beijing 100035, China.

Publication

Eur J Radiol. 2025 Sep;190:112252. doi: 10.1016/j.ejrad.2025.112252. Epub 2025 Jun 18.

DOI: 10.1016/j.ejrad.2025.112252
PMID: 40544718
Abstract

OBJECTIVES

To develop and validate a CT image-based multiple time-series deep learning model for the longitudinal prediction of benign and malignant pulmonary ground-glass nodules (GGNs).

METHODS

A total of 486 GGNs from an equal number of patients were included in this research, which took place at two medical centers. Each nodule underwent surgical removal and was confirmed pathologically. The patients were randomly assigned to a training set, validation set, and test set, following a distribution ratio of 7:2:1. We established a transformer-based deep learning framework that leverages multi-temporal CT images for the longitudinal prediction of GGNs, focusing on distinguishing between benign and malignant types. Additionally, we utilized 13 different machine learning algorithms to formulate clinical models, delta-radiomics models, and combined models that merge deep learning with CT semantic features. The predictive capabilities of the models were assessed using the receiver operating characteristic (ROC) curve and the area under the curve (AUC).
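As a rough illustration of the evaluation setup described above (not the authors' actual pipeline, which trains a transformer on multi-temporal CT volumes), the 7:2:1 patient-level split and the AUC metric can be sketched in plain Python. All function names here are illustrative, not from the paper:

```python
import random

def split_patients(ids, ratios=(0.7, 0.2, 0.1), seed=42):
    """Randomly partition patient IDs into train/val/test at a 7:2:1 ratio.
    With one nodule per patient (as in this study), a patient-level split
    is also a nodule-level split, avoiding leakage across sets."""
    ids = list(ids)
    random.Random(seed).shuffle(ids)
    n = len(ids)
    n_train = round(n * ratios[0])
    n_val = round(n * ratios[1])
    return ids[:n_train], ids[n_train:n_train + n_val], ids[n_train + n_val:]

def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen malignant case scores higher
    than a randomly chosen benign case (ties count as half a win)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > q) + 0.5 * (p == q) for p in pos for q in neg)
    return wins / (len(pos) * len(neg))

train, val, test = split_patients(range(486))
print(len(train), len(val), len(test))  # 340 97 49
print(auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # 0.75
```

With 486 nodules, a 7:2:1 split yields roughly 340/97/49 cases; the rank-based AUC definition is equivalent to integrating the ROC curve, which is how the reported values (e.g. 0.911 on the training set) are computed.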

RESULTS

The multiple time-series deep learning model based on CT images surpassed both the clinical model and the delta-radiomics model, showcasing strong predictive capabilities for GGNs across the training, validation, and test sets, with AUCs of 0.911 (95% CI, 0.879-0.939), 0.809 (95% CI, 0.715-0.908), and 0.817 (95% CI, 0.680-0.937), respectively. Furthermore, the models that integrated deep learning with CT semantic features achieved the highest performance, resulting in AUCs of 0.960 (95% CI, 0.912-0.977), 0.878 (95% CI, 0.801-0.942), and 0.890 (95% CI, 0.790-0.968).

CONCLUSION

The multiple time-series deep learning model utilizing CT images was effective in predicting benign and malignant GGNs.


Similar articles

1. Deep learning model using CT images for longitudinal prediction of benign and malignant ground-glass nodules. Eur J Radiol. 2025 Sep;190:112252. doi: 10.1016/j.ejrad.2025.112252. Epub 2025 Jun 18.
2. Development and Validation of a Convolutional Neural Network Model to Predict a Pathologic Fracture in the Proximal Femur Using Abdomen and Pelvis CT Images of Patients With Advanced Cancer. Clin Orthop Relat Res. 2023 Nov 1;481(11):2247-2256. doi: 10.1097/CORR.0000000000002771. Epub 2023 Aug 23.
3. Development and interpretation of machine learning-based prognostic models for predicting high-risk prognostic pathological components in pulmonary nodules: integrating clinical features, serum tumor marker and imaging features. J Cancer Res Clin Oncol. 2025 Jun 17;151(6):190. doi: 10.1007/s00432-025-06241-7.
4. Deep transfer learning radiomics combined with explainable machine learning for preoperative thymoma risk prediction based on CT. Eur J Radiol. 2025 Sep;190:112266. doi: 10.1016/j.ejrad.2025.112266. Epub 2025 Jun 26.
5. Radiomic 'Stress Test': exploration of a deep learning radiomic model in a high-risk prospective lung nodule cohort. BMJ Open Respir Res. 2025 Jun 27;12(1):e002687. doi: 10.1136/bmjresp-2024-002687.
6. Predicting brain metastases in EGFR-positive lung adenocarcinoma patients using pre-treatment CT lung imaging data. Eur J Radiol. 2025 Sep;190:112265. doi: 10.1016/j.ejrad.2025.112265. Epub 2025 Jun 26.
7. A systematic review on feature extraction methods and deep learning models for detection of cancerous lung nodules at an early stage - the recent trends and challenges. Biomed Phys Eng Express. 2024 Nov 20;11(1). doi: 10.1088/2057-1976/ad9154.
8. Dual-energy CT Radiomics Combined with Quantitative Parameters for Differentiating Lung Adenocarcinoma From Squamous Cell Carcinoma: A Dual-center Study. Acad Radiol. 2025 Mar;32(3):1675-1684. doi: 10.1016/j.acra.2024.09.024. Epub 2024 Sep 25.
9. Establishing an AI-based diagnostic framework for pulmonary nodules in computed tomography. BMC Pulm Med. 2025 Jul 12;25(1):339. doi: 10.1186/s12890-025-03806-7.
10. 2.5D deep learning radiomics and clinical data for predicting occult lymph node metastasis in lung adenocarcinoma. BMC Med Imaging. 2025 Jul 1;25(1):225. doi: 10.1186/s12880-025-01759-1.