Centre for Quantitative Medicine, Duke-NUS Medical School, Singapore.
Department of Biostatistics and Bioinformatics, Duke University, USA.
J Biomed Inform. 2024 Aug;156:104673. doi: 10.1016/j.jbi.2024.104673. Epub 2024 Jun 9.
OBJECTIVE: Pneumothorax is an acute thoracic disease caused by an abnormal collection of air between the lung and the chest wall. Recently, artificial intelligence (AI), especially deep learning (DL), has been increasingly employed to automate the diagnosis of pneumothorax. To address the opaqueness often associated with DL models, explainable artificial intelligence (XAI) methods have been introduced to outline regions related to pneumothorax. However, these explanations sometimes diverge from the actual lesion areas, highlighting the need for further improvement.
METHOD: We propose a template-guided approach that incorporates clinical knowledge of pneumothorax into the model explanations generated by XAI methods, thereby enhancing their quality. Using one lesion delineation created by radiologists, our approach first generates a template that represents the potential areas of pneumothorax occurrence. This template is then superimposed on the model explanations to filter out extraneous attributions that fall outside the template's boundaries. To validate its efficacy, we carried out a comparative analysis of three XAI methods (Saliency Map, Grad-CAM, and Integrated Gradients), with and without template guidance, when explaining two DL models (VGG-19 and ResNet-50) on two real-world datasets (SIIM-ACR and ChestX-Det).
RESULTS: The proposed approach consistently improved the baseline XAI methods across twelve benchmark scenarios built on three XAI methods, two DL models, and two datasets. Comparing model explanations against ground-truth lesion areas, the average incremental percentages, calculated as the performance improvement relative to baseline performance, were 97.8% in Intersection over Union (IoU) and 94.1% in Dice Similarity Coefficient (DSC). We further visualized baseline and template-guided model explanations on radiographs to showcase the performance of our approach.
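The core operations above, masking an XAI attribution map with a binary template and scoring the result against a ground-truth lesion mask with IoU and DSC, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the 2-D NumPy arrays, the 0.5 binarization threshold, and the function names are assumptions.

```python
import numpy as np

def apply_template(explanation, template, threshold=0.5):
    """Zero out attributions that fall outside the template region.

    `explanation` is a 2-D attribution map from an XAI method;
    `template` is a map of potential pneumothorax areas, binarized here.
    """
    binary_template = (template >= threshold).astype(explanation.dtype)
    return explanation * binary_template

def iou(pred_mask, gt_mask):
    """Intersection over Union between two boolean masks."""
    inter = np.logical_and(pred_mask, gt_mask).sum()
    union = np.logical_or(pred_mask, gt_mask).sum()
    return inter / union if union else 0.0

def dsc(pred_mask, gt_mask):
    """Dice Similarity Coefficient between two boolean masks."""
    inter = np.logical_and(pred_mask, gt_mask).sum()
    total = pred_mask.sum() + gt_mask.sum()
    return 2 * inter / total if total else 0.0

# Toy example: a 2x2 attribution map, template, and ground truth.
explanation = np.array([[0.9, 0.7], [0.8, 0.2]])
template = np.array([[1, 0], [1, 0]])
guided = apply_template(explanation, template)   # [[0.9, 0.0], [0.8, 0.0]]
pred_mask = guided > 0.5
gt_mask = np.array([[True, False], [False, False]])
```

In this toy case the template suppresses the spurious 0.7 attribution outside the plausible region, which raises both IoU and DSC relative to thresholding the raw explanation.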
CONCLUSIONS: In the context of pneumothorax diagnosis, we proposed a template-guided approach for improving model explanations. Our approach not only aligns model explanations more closely with clinical insights but also extends readily to other thoracic diseases. We anticipate that our template guidance will forge a novel approach to elucidating AI models by integrating clinical domain expertise.