

Multimodal ultrasound fusion network for differentiating between benign and malignant solid renal tumors.

Author Information

Zhu Dongmei, Li Junyu, Li Yan, Wu Ji, Zhu Lin, Li Jian, Wang Zimo, Xu Jinfeng, Dong Fajin, Cheng Jun

Affiliations

Department of Ultrasound, The Second Clinical Medical College, Jinan University, Shenzhen, China.

Department of Ultrasound, The Affiliated Nanchong Central Hospital of North Sichuan Medical College, Nanchong, China.

Publication Information

Front Mol Biosci. 2022 Sep 6;9:982703. doi: 10.3389/fmolb.2022.982703. eCollection 2022.

Abstract

We aim to establish a deep learning model called multimodal ultrasound fusion network (MUF-Net), based on gray-scale and contrast-enhanced ultrasound (CEUS) images, for automatically classifying benign and malignant solid renal tumors, and to compare the model's performance with assessments by radiologists with different levels of experience. A retrospective study included the CEUS videos of 181 patients with solid renal tumors (81 benign and 100 malignant) from June 2012 to June 2021. A total of 9794 B-mode and CEUS-mode images were cropped from the CEUS videos. MUF-Net was proposed to combine gray-scale and CEUS images to differentiate between benign and malignant solid renal tumors. In this network, two independent branches extract features from the two modalities, and the features are fused using adaptive weights. Finally, the network outputs a classification score based on the fused features. The model's performance was evaluated using five-fold cross-validation and compared with the assessments of two groups of radiologists with different levels of experience. For discriminating between benign and malignant solid renal tumors, the junior radiologist group, senior radiologist group, and MUF-Net achieved accuracies of 70.6%, 75.7%, and 80.0%, sensitivities of 89.3%, 95.9%, and 80.4%, specificities of 58.7%, 62.9%, and 79.1%, and areas under the receiver operating characteristic curve of 0.740 (95% confidence interval (CI): 0.70-0.75), 0.794 (95% CI: 0.72-0.83), and 0.877 (95% CI: 0.83-0.93), respectively. The MUF-Net model can accurately classify benign and malignant solid renal tumors and achieved better performance than the senior radiologists. The CEUS video data contain the entire tumor microcirculation perfusion characteristics.
The proposed MUF-Net based on B-mode and CEUS-mode images can accurately distinguish between benign and malignant solid renal tumors with an area under the receiver operating characteristic curve of 0.877, which surpasses senior radiologists' assessments by a large margin.
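The fusion step described above — two modality-specific branches whose features are combined with adaptive weights before classification — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the abstract does not specify the fusion mechanism, so the softmax-normalized gating layer, feature dimensions, and all variable names here are assumptions.

```python
import numpy as np

def adaptive_fusion(feat_b, feat_ceus, gate_w, gate_b):
    """Fuse B-mode and CEUS feature vectors with adaptive weights.

    One scalar weight per modality is computed from the concatenated
    features by a small gating layer and normalized with a softmax, so
    the fused representation can emphasize the more informative modality.
    (Hypothetical mechanism; the paper only states "adaptive weights".)
    """
    concat = np.concatenate([feat_b, feat_ceus])
    logits = gate_w @ concat + gate_b           # one logit per modality
    exp = np.exp(logits - logits.max())          # numerically stable softmax
    w = exp / exp.sum()                          # adaptive weights, sum to 1
    fused = w[0] * feat_b + w[1] * feat_ceus     # weighted feature fusion
    return fused, w

# Toy example with 4-dimensional branch features
rng = np.random.default_rng(0)
fb = rng.normal(size=4)        # features from the B-mode branch
fc = rng.normal(size=4)        # features from the CEUS branch
W = rng.normal(size=(2, 8))    # gating weights: 2 modalities, 8 = 4 + 4 inputs
b = np.zeros(2)

fused, weights = adaptive_fusion(fb, fc, W, b)
```

The fused vector would then feed a classification head that outputs the benign/malignant score; in the full model the branch features would come from convolutional backbones trained end-to-end, with the gating parameters learned jointly.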


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/08df/9488515/a9bb62f13fdd/fmolb-09-982703-g001.jpg
