
Deep learning with test-time augmentation for radial endobronchial ultrasound image differentiation: a multicentre verification study.

Affiliations

Department of Internal Medicine, National Taiwan University Hospital Hsin-Chu Branch, Hsinchu, Taiwan.

Graduate Institute of Clinical Medicine, National Taiwan University College of Medicine, Taipei, Taiwan.

Publication information

BMJ Open Respir Res. 2023 Aug;10(1). doi: 10.1136/bmjresp-2022-001602.

Abstract

PURPOSE

Despite the importance of radial endobronchial ultrasound (rEBUS) in transbronchial biopsy, researchers have yet to apply artificial intelligence to the analysis of rEBUS images.

MATERIALS AND METHODS

This study developed a convolutional neural network (CNN) to differentiate between malignant and benign tumours in rEBUS images. rEBUS images were retrospectively collected from medical centres in Taiwan: 769 images from National Taiwan University Hospital Hsin-Chu Branch, Hsinchu Hospital, used for model training (615 images) and internal validation (154 images), as well as 300 images from National Taiwan University Hospital (NTUH-TPE) and 92 images from National Taiwan University Hospital Hsin-Chu Branch, Biomedical Park Hospital (NTUH-BIO), used for external validation. The model was further assessed using image augmentation during the training phase and test-time augmentation (TTA).
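The authors' code is not published; as a rough illustration of the TTA step named above, the sketch below averages a classifier's malignancy probability over flipped and 90-degree-rotated copies of an input image. `model_predict` is a hypothetical stand-in for the trained CNN's forward pass.

```python
import numpy as np

def tta_predict(model_predict, image):
    """Test-time augmentation (sketch): average the model's output
    probability over flipped and rotated variants of one image."""
    variants = [
        image,
        np.fliplr(image),    # horizontal flip
        np.flipud(image),    # vertical flip
        np.rot90(image, 1),  # 90-degree rotations
        np.rot90(image, 2),
        np.rot90(image, 3),
    ]
    probs = [model_predict(v) for v in variants]
    return float(np.mean(probs))
```

For a real CNN the augmented variants yield slightly different probabilities, and averaging them typically smooths the decision boundary at inference time without retraining.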

RESULTS

In the internal validation dataset, the AUC was 0.88 (95% CI 0.83 to 0.92), sensitivity 0.80 (95% CI 0.73 to 0.88) and specificity 0.75 (95% CI 0.66 to 0.83). In the NTUH-TPE external validation dataset, the AUC was 0.76 (95% CI 0.71 to 0.80), sensitivity 0.58 (95% CI 0.50 to 0.65) and specificity 0.92 (95% CI 0.88 to 0.97). In the NTUH-BIO external validation dataset, the AUC was 0.72 (95% CI 0.64 to 0.82), sensitivity 0.71 (95% CI 0.55 to 0.86) and specificity 0.76 (95% CI 0.64 to 0.87). After fine-tuning, the AUC values for the external validation cohorts were 0.78 (NTUH-TPE) and 0.82 (NTUH-BIO). Our findings also demonstrated the feasibility of the model in differentiating between lung cancer subtypes, as indicated by the following AUC values: adenocarcinoma 0.70 (95% CI 0.64 to 0.76), squamous cell carcinoma 0.64 (95% CI 0.54 to 0.74) and small cell lung cancer 0.52 (95% CI 0.32 to 0.72).
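Confidence intervals like those reported above are commonly obtained by bootstrap resampling; the abstract does not specify the method, so the following is a minimal sketch, not the authors' code. It computes AUC via the Mann-Whitney pairwise formulation and percentile bootstrap bounds.

```python
import numpy as np

def roc_auc(y_true, y_score):
    """AUC via the Mann-Whitney formulation: the fraction of
    (positive, negative) pairs where the positive scores higher
    (ties count as half)."""
    pos = y_score[y_true == 1]
    neg = y_score[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

def bootstrap_auc_ci(y_true, y_score, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap 95% CI for the AUC (sketch)."""
    rng = np.random.default_rng(seed)
    n = len(y_true)
    stats = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)        # resample with replacement
        yt, ys = y_true[idx], y_score[idx]
        if yt.min() == yt.max():           # skip one-class resamples
            continue
        stats.append(roc_auc(yt, ys))
    lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return float(lo), float(hi)
```

With per-image labels and model scores from a validation set, these two functions reproduce the "point estimate (95% CI lower to upper)" pattern used throughout the results.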

CONCLUSIONS

Our results demonstrate the feasibility of the proposed CNN-based algorithm in differentiating between malignant and benign lesions in rEBUS images.

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/433b/10401203/94d57ee93e5c/bmjresp-2022-001602f01.jpg
