Dorrestein Linda, Ritter Caroline, De Mol Zoë, Wichtel Maureen, Cary Julie, Vengrin Courtney, Artemiou Elpida, Adams Cindy L, Ganshorn Heather, Coe Jason B, Barkema H, Hecker Kent G
University of Calgary, Calgary, Alberta, Canada.
University of Prince Edward Island, Charlottetown, Prince Edward Island, Canada.
BMJ Open. 2025 Sep 5;15(9):e096799. doi: 10.1136/bmjopen-2024-096799.
Communication skills assessment (CSA) is essential for ensuring competency, guiding educational practices and safeguarding regulatory compliance in health professions education (HPE). However, there appears to be heterogeneity in the reporting of validity evidence from CSA methods across the health professions, which complicates our interpretation of the quality of assessment methods. Our objective was to map reliability and validity evidence from scores of CSA methods that have been reported in HPE.
Scoping review.
MEDLINE, Embase, PsycINFO, CINAHL, ERIC, CAB Abstracts and Scopus databases were searched up to March 2024.
We included studies, available in English, that reported validity evidence (content-related, internal structure, relationship with other variables, response processes and consequences) for CSA methods in HPE. There were no restrictions related to date of publication.
Two independent reviewers completed data extraction and assessed study quality using the Medical Education Research Study Quality Instrument. Data were reported using descriptive analysis (mean, median, range).
A total of 146 eligible studies were identified, including 98 394 participants. Most studies were conducted in human medicine (124 studies) and participants were mostly undergraduate students (85 studies). Performance-based, simulated, in-person CSA was most prevalent, comprising 115 studies, of which 68 were objective structured clinical examination-based. Other types of methods reported were workplace-based assessment; asynchronous, video-based assessment; knowledge-based assessment; and performance-based, simulated, virtual assessment. Included studies used a diverse range of communication skills frameworks, rating scales and raters. Internal structure was the most reported source of validity evidence (130 studies, 90%), followed by content-related (108 studies, 74%), relationships with other variables (86 studies, 59%), response processes (15 studies, 10%) and consequences (16 studies, 11%).
This scoping review identified gaps, by assessment method, in the sources of validity evidence that have been used to support the use of CSA methods. These gaps could be addressed by studies explicitly defining the communication skill construct(s) assessed, clarifying the validity source(s) reported and defining the intended purpose and use of the scores (ie, for learning and feedback, or for decision-making purposes). Our review provides a map of where targeted CSA development and support are needed. Limitations of the evidence stem from score interpretation being constrained by the heterogeneity of definitions of communication skills across the health professions and by the reporting quality of the studies.