Hughes-Noehrer Lukas, Channer Leda, Strain Gabriel, Yates Gregory, Body Richard, Jay Caroline
Department of Computer Science, The University of Manchester, Manchester M13 9PL, United Kingdom.
Manchester University NHS Foundation Trust, Manchester M13 9WL, United Kingdom.
JAMIA Open. 2025 Jul 21;8(4):ooaf071. doi: 10.1093/jamiaopen/ooaf071. eCollection 2025 Aug.
OBJECTIVES: To investigate clinicians' attitudes towards current automated ECG interpretation and novel AI technologies, and their perceptions of computer-assisted interpretation.

MATERIALS AND METHODS: We conducted a series of interviews with clinicians in the UK. Our study: (1) explores the potential for AI, specifically future "human-like" computing approaches, to facilitate ECG interpretation and support clinical decision making, and (2) elicits clinicians' opinions about the importance of explainability and trustworthiness of AI algorithms.

RESULTS: We performed inductive thematic analysis on interview transcripts from 23 clinicians and identified the following themes: (1) a lack of trust in current systems, (2) positive attitudes towards future AI applications and requirements for these, (3) the relationship between the accuracy and explainability of algorithms, and (4) opinions on education, possible deskilling, and the impact of AI on clinical competencies.

DISCUSSION: Clinicians do not trust current computerised methods, but welcome future "AI" technologies. Where clinicians trust future AI interpretation to be accurate, they are less concerned that it is explainable. They also prefer ECG interpretation that demonstrates the results of the algorithm visually. Whilst clinicians do not fear job losses, they are concerned about deskilling and the need to educate the workforce to use AI responsibly.

CONCLUSION: Clinicians are positive about the future application of AI in clinical decision making. Accuracy is a key factor in uptake, and visualisations are preferred over current computerised methods. Visualisation is viewed as a potential means of training and upskilling, in contrast to the deskilling that automation might be perceived to bring.