Chen Jinying, Druhl Emily, Polepalli Ramesh Balaji, Houston Thomas K, Brandt Cynthia A, Zulman Donna M, Vimalananda Varsha G, Malkani Samir, Yu Hong
Department of Quantitative Health Sciences, University of Massachusetts Medical School, Worcester, MA, United States.
Bedford Veterans Affairs Medical Center, Center for Healthcare Organization and Implementation Research, Bedford, MA, United States.
J Med Internet Res. 2018 Jan 22;20(1):e26. doi: 10.2196/jmir.8669.
Many health care systems now allow patients to access their electronic health record (EHR) notes online through patient portals. Medical jargon in EHR notes can confuse patients, which may interfere with potential benefits of patient access to EHR notes.
The aim of this study was to develop and evaluate the usability and content quality of NoteAid, a Web-based natural language processing system that links medical terms in EHR notes to lay definitions, that is, definitions easily understood by lay people.
NoteAid incorporates two core components: CoDeMed, a lexical resource of lay definitions for medical terms, and MedLink, a computational unit that links medical terms to lay definitions. To facilitate building CoDeMed, we developed innovative computational methods, including an adapted distant supervision algorithm that prioritizes medical terms important for EHR comprehension. Ten physician domain experts evaluated the user interface and content quality of NoteAid. The evaluation protocol included a cognitive walkthrough session and a postsession questionnaire. Physician feedback sessions were audio-recorded. We used standard content analysis methods to analyze qualitative data from these sessions.
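The abstract does not detail MedLink's matching algorithm; as a minimal illustration of the general idea of linking medical terms in note text to lay definitions, the sketch below does a greedy longest-match lookup against a small toy lexicon. The `LEXICON` entries are hypothetical stand-ins for CoDeMed content, and the real system is certainly more sophisticated (e.g., handling morphology, abbreviations, and context).

```python
# Hypothetical stand-in for CoDeMed: a few lay definitions keyed by term.
LEXICON = {
    "hypertension": "high blood pressure",
    "myocardial infarction": "a heart attack",
    "edema": "swelling caused by fluid trapped in the body's tissues",
}

def link_terms(note):
    """Return (term, lay definition) pairs found in the note.

    Greedy longest-match scan: at each position, try the longest
    candidate phrase (up to 3 words here) before shorter ones.
    """
    words = note.lower().split()
    links = []
    i = 0
    while i < len(words):
        matched = False
        for span in range(min(3, len(words) - i), 0, -1):
            phrase = " ".join(words[i:i + span]).strip(".,;")
            if phrase in LEXICON:
                links.append((phrase, LEXICON[phrase]))
                i += span
                matched = True
                break
        if not matched:
            i += 1
    return links

print(link_terms("Patient has hypertension and prior myocardial infarction."))
```

The longest-match-first loop matters for multiword terms: without it, "myocardial infarction" could never be linked as a unit.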
Physician feedback was mixed. Positive feedback on NoteAid included (1) ease of use, (2) good visual display, (3) satisfactory system speed, and (4) adequate lay definitions. Opportunities for improvement arising from evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary with context, and (4) standardizing the scope of definitions for medicines. On the basis of these results, we improved NoteAid's user interface and a number of definitions, and added 4502 definitions to CoDeMed.
Physician evaluation yielded useful feedback for content validation and refinement of this innovative tool, which has the potential to improve patient EHR comprehension and the experience of using patient portals. Ongoing work will develop algorithms to handle ambiguous medical terms and will test and evaluate NoteAid with patients.