Yan Hongju, Dai Chaochao, Xu Xiaojing, Qiu Yuxuan, Yu Lifang, Huang Lewen, Lin Bei, Huang Jianan, Jiang Chenxiang, Shen Yingzhao, Ji Jing, Li Youcheng, Bao Lingyun
Department of Ultrasound, Affiliated Hangzhou First People's Hospital, School of Medicine, Westlake University, Huansha Road 261, Shangcheng District, Hangzhou, 310006, P. R. China.
Ultrasonography, Zhejiang Chinese Medical University, Hangzhou, China.
Sci Rep. 2025 Apr 6;15(1):11754. doi: 10.1038/s41598-025-95871-5.
To investigate the potential of employing artificial intelligence (AI)-driven breast ultrasound analysis models for the classification of glandular tissue components (GTC) in dense breast tissue. A total of 1,848 healthy women with mammograms classified as dense breast were enrolled in this prospective study. A Residual Network (ResNet) 101 classification model and a ResNet with Fully Convolutional Network (ResNet + FCN) segmentation model were trained. The more effective model was selected to assess the classification performance of 3 breast radiologists and 3 non-breast radiologists. The evaluation metrics included sensitivity, specificity, and positive predictive value (PPV). The ResNet101 model demonstrated superior performance compared to the ResNet + FCN model. It significantly improved the classification sensitivity of the six radiologists by 0.060, 0.021, 0.170, 0.009, 0.052, and 0.047, respectively. For P1 to P4 glandular tissue components, the PPVs of all radiologists increased by 0.154, 0.178, 0.027, and 0.109, respectively, with AI assistance. Notably, the non-breast radiologists showed a particularly substantial rise in PPV (p < 0.01). The ResNet101 deep learning model trained in this study is a reliable and accurate system for assisting radiologists of different experience levels in differentiating glandular tissue components of dense breasts on ultrasound images.
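As a rough illustration of the approach described in the abstract (not the authors' code), the sketch below fine-tunes a torchvision ResNet-101 for a 4-class classification task and computes per-class sensitivity, specificity, and PPV from a confusion matrix. The class labels P1-P4, input format, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch, assuming PyTorch/torchvision: ResNet-101 fine-tuning for
# 4-class glandular tissue component (P1-P4) classification on ultrasound
# images, plus per-class sensitivity, specificity, and PPV. Settings are
# illustrative, not taken from the study.
import numpy as np
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 4  # assumed P1-P4 glandular tissue categories

# ImageNet-pretrained backbone; final layer replaced for 4-way classification.
model = models.resnet101(weights=models.ResNet101_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # illustrative values

def train_step(images, labels):
    """One optimisation step on a batch of images shaped (N, 3, H, W)."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()

def per_class_metrics(y_true, y_pred, num_classes=NUM_CLASSES):
    """Sensitivity, specificity, and PPV per class from a confusion matrix."""
    cm = np.zeros((num_classes, num_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    metrics = {}
    for c in range(num_classes):
        tp = cm[c, c]
        fn = cm[c].sum() - tp
        fp = cm[:, c].sum() - tp
        tn = cm.sum() - tp - fn - fp
        metrics[f"P{c + 1}"] = {
            "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
            "specificity": tn / (tn + fp) if tn + fp else float("nan"),
            "ppv": tp / (tp + fp) if tp + fp else float("nan"),
        }
    return metrics
```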