
Decoding the black box: Explainable AI (XAI) for cancer diagnosis, prognosis, and treatment planning — a state-of-the-art systematic review.

Affiliations

School of Electrical & Electronic Engineering, Engineering Campus, Universiti Sains Malaysia (USM), Malaysia.

Publication information

Int J Med Inform. 2025 Jan;193:105689. doi: 10.1016/j.ijmedinf.2024.105689. Epub 2024 Nov 4.

Abstract

OBJECTIVE

Explainable Artificial Intelligence (XAI) is increasingly recognized as a crucial tool in cancer care, with significant potential to enhance diagnosis, prognosis, and treatment planning. However, the holistic integration of XAI across all stages of cancer care remains underexplored. This review addresses this gap by systematically evaluating the role of XAI in these critical areas, identifying key challenges and emerging trends.

MATERIALS AND METHODS

Following the PRISMA guidelines, a comprehensive literature search was conducted across Scopus and Web of Science, focusing on publications from January 2020 to May 2024. After rigorous screening and quality assessment, 69 studies were selected for in-depth analysis.

RESULTS

The review identified critical gaps in the application of XAI within cancer care, notably the exclusion of clinicians in 83% of studies, which raises concerns about real-world applicability and may lead to explanations that are technically sound but clinically irrelevant. Additionally, 87% of studies lacked rigorous evaluation of XAI explanations, compromising their reliability in clinical practice. The dominance of post-hoc visual methods such as SHAP, LIME, and Grad-CAM reflects a trend toward explanations that may be inherently flawed due to their reliance on specific input perturbations and simplifying assumptions. The lack of formal evaluation metrics and standardization constrains broader XAI adoption in clinical settings, creating a disconnect between AI development and clinical integration. Moreover, translating XAI insights into actionable clinical decisions remains challenging due to the absence of clear guidelines for integrating these tools into clinical workflows.
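The perturbation-based reasoning behind post-hoc explainers like SHAP and LIME can be illustrated with a minimal occlusion sketch: replace one feature at a time with a baseline value and attribute the resulting drop in the model's output to that feature. This is a simplified toy illustration of the general idea, not the actual SHAP or LIME API; the linear "model" and the feature values below are hypothetical stand-ins, and real explainers use more elaborate sampling and weighting schemes.

```python
# Toy illustration of perturbation-based attribution, the idea underlying
# post-hoc explainers such as LIME and SHAP. The "model" here is an
# arbitrary stand-in black box, not a real cancer-care classifier.

def model(x):
    # Stand-in black box: a fixed linear scorer over three features.
    weights = [0.7, -0.2, 0.1]
    return sum(w * xi for w, xi in zip(weights, x))

def occlusion_attributions(predict, x, baseline):
    """Attribute each feature by the prediction change when that
    feature alone is replaced with its baseline value."""
    full = predict(x)
    attributions = []
    for i in range(len(x)):
        perturbed = list(x)
        perturbed[i] = baseline[i]  # occlude feature i
        attributions.append(full - predict(perturbed))
    return attributions

if __name__ == "__main__":
    x = [1.0, 1.0, 1.0]
    baseline = [0.0, 0.0, 0.0]
    print(occlusion_attributions(model, x, baseline))
```

The caveat noted in the review applies directly: the attributions depend on the chosen baseline and perturbation scheme, so the "explanation" reflects those simplifying assumptions as much as the model itself.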

CONCLUSION

This review highlights the need for greater clinician involvement, standardized XAI evaluation metrics, clinician-centric interfaces, context-aware XAI systems, and frameworks for integrating XAI into clinical workflows for informed clinical decision-making and improved outcomes in cancer care.

