Department of Electrical and Electronic Engineering, Bangladesh University of Engineering and Technology (BUET), Dhaka 1205, Bangladesh.
Phys Med Biol. 2023 Dec 26;69(1). doi: 10.1088/1361-6560/ad1319.
Breast cancer is the major cause of cancer death among women worldwide. Deep learning-based computer-aided diagnosis (CAD) systems for classifying lesions in breast ultrasound (BUS) images can help materialise the early detection of breast cancer and enhance survival chances.

This paper presents a fully automated BUS diagnosis system built from modular convolutional neural networks tuned with novel loss functions. The proposed network comprises a dynamic channel input enhancement network, an attention-guided InceptionV3-based feature extraction network, a classification network, and a parallel feature transformation network that maps deep features into quantitative ultrasound (QUS) feature space. These networks work together to improve classification accuracy by simultaneously enriching the benign and malignant class-specific features and increasing their separation. Unlike traditional approaches based on categorical cross-entropy (CCE) loss alone, our method adds two novel losses, a class activation mapping (CAM)-based loss and a QUS feature-based loss, which enable the overall network to learn to extract clinically valuable lesion shape- and texture-related properties while focusing primarily on the lesion area, in support of explainable AI (XAI).

Experiments on four public datasets, one private dataset, and a combined breast ultrasound dataset validate the proposed strategy. The proposed technique achieves an accuracy of 97.28%, sensitivity of 93.87%, and F1-score of 95.42% on dataset 1 (BUSI), and an accuracy of 91.50%, sensitivity of 89.38%, and F1-score of 89.31% on the combined dataset, which consists of 1494 images collected from hospitals in five demographic locations using four ultrasound systems from different manufacturers. These results outperform techniques reported in the literature by a considerable margin.

The proposed CAD system provides a diagnosis from the auto-focused lesion area of B-mode BUS images, without requiring explicit segmentation or region-of-interest extraction, and can therefore be a handy tool for making accurate and reliable diagnoses even in unspecialized healthcare centers.
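As background for the CAM-based loss described above, the sketch below computes a standard class activation map (Zhou et al 2016): the last convolutional feature maps are weighted by the classifier weights of the target class and summed over channels. The framework (PyTorch) and the function name class_activation_map are illustrative assumptions, not the paper's code.

    import torch
    import torch.nn.functional as F

    def class_activation_map(feature_maps, fc_weights, class_idx):
        """Standard CAM: weight the final conv feature maps by the
        classifier weights of the target class, then sum over channels."""
        # feature_maps: (B, C, H, W); fc_weights: (num_classes, C).
        w = fc_weights[class_idx].view(1, -1, 1, 1)       # (1, C, 1, 1)
        cam = F.relu((feature_maps * w).sum(dim=1))       # (B, H, W)
        # Normalise each map to [0, 1] so maps are comparable across images.
        cam = cam - cam.amin(dim=(1, 2), keepdim=True)
        return cam / (cam.amax(dim=(1, 2), keepdim=True) + 1e-8)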
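The abstract specifies a three-term training objective (CCE plus the CAM-based and QUS feature-based losses) without giving the formulations. The following is a minimal sketch under stated assumptions: the CAM term is taken as an L1 penalty pulling the activation map toward a lesion-focused reference map, the QUS term as a mean-squared error between the feature transformation network's output and precomputed QUS features, and lambda_cam and lambda_qus are hypothetical weighting hyperparameters.

    import torch
    import torch.nn.functional as F

    def combined_loss(logits, labels, cam, ref_cam, qus_pred, qus_target,
                      lambda_cam=0.1, lambda_qus=0.1):
        """Weighted sum of CCE, CAM-based, and QUS feature-based losses
        (assumed formulations; weights are hypothetical)."""
        # Standard categorical cross-entropy on the class logits.
        l_cce = F.cross_entropy(logits, labels)
        # CAM-based term: steer the activation map toward a lesion-focused
        # reference map (assumed L1 formulation).
        l_cam = F.l1_loss(cam, ref_cam)
        # QUS-based term: pull the feature transformation network's output
        # toward handcrafted quantitative ultrasound features (assumed MSE).
        l_qus = F.mse_loss(qus_pred, qus_target)
        return l_cce + lambda_cam * l_cam + lambda_qus * l_qus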