A Natural Language Processing System That Links Medical Terms in Electronic Health Record Notes to Lay Definitions: System Development Using Physician Reviews.

Authors

Chen Jinying, Druhl Emily, Polepalli Ramesh Balaji, Houston Thomas K, Brandt Cynthia A, Zulman Donna M, Vimalananda Varsha G, Malkani Samir, Yu Hong

Affiliations

Department of Quantitative Health Sciences, University of Massachusetts Medical School, Worcester, MA, United States.

Bedford Veterans Affairs Medical Center, Center for Healthcare Organization and Implementation Research, Bedford, MA, United States.

Publication Information

J Med Internet Res. 2018 Jan 22;20(1):e26. doi: 10.2196/jmir.8669.

Abstract

BACKGROUND

Many health care systems now allow patients to access their electronic health record (EHR) notes online through patient portals. Medical jargon in EHR notes can confuse patients, which may interfere with potential benefits of patient access to EHR notes.

OBJECTIVE

The aim of this study was to develop and evaluate the usability and content quality of NoteAid, a Web-based natural language processing system that links medical terms in EHR notes to lay definitions, that is, definitions easily understood by lay people.

METHODS

NoteAid incorporates two core components: CoDeMed, a lexical resource of lay definitions for medical terms, and MedLink, a computational unit that links medical terms to lay definitions. To facilitate the building of CoDeMed, we developed innovative computational methods, including an adapted distant supervision algorithm that prioritizes medical terms important for EHR comprehension. Ten physician domain experts evaluated the user interface and content quality of NoteAid. The evaluation protocol included a cognitive walkthrough session and a postsession questionnaire. Physician feedback sessions were audio-recorded, and we used standard content analysis methods to analyze the qualitative data from these sessions.
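
To make the linking step concrete, the sketch below shows one way a MedLink-style lookup against a CoDeMed-style lexicon might work. It is a minimal illustration under stated assumptions, not the authors' implementation: the toy LAY_LEXICON, the link_terms function, and the longest-match-first strategy are hypothetical stand-ins, and the distant supervision step that prioritizes terms is not shown.

```python
import re

# Hypothetical miniature lexicon standing in for CoDeMed: medical term -> lay definition.
LAY_LEXICON = {
    "hypertension": "high blood pressure",
    "myocardial infarction": "a heart attack",
    "edema": "swelling caused by fluid building up in body tissues",
}


def link_terms(note, lexicon):
    """Find lexicon terms in an EHR note and attach their lay definitions.

    Longer terms are matched first, and spans that overlap an existing link
    are skipped, so multiword entries are not split into partial matches.
    """
    links = []
    covered = []  # (start, end) spans already linked
    for term in sorted(lexicon, key=len, reverse=True):
        for match in re.finditer(r"\b" + re.escape(term) + r"\b", note, flags=re.IGNORECASE):
            start, end = match.span()
            if any(start < e and s < end for s, e in covered):
                continue  # overlaps a longer term already linked
            covered.append((start, end))
            links.append({
                "term": match.group(0),
                "start": start,
                "end": end,
                "definition": lexicon[term],
            })
    return sorted(links, key=lambda link: link["start"])


if __name__ == "__main__":
    note = "Assessment: hypertension, prior myocardial infarction, mild lower-extremity edema."
    for link in link_terms(note, LAY_LEXICON):
        print(f'{link["term"]} [{link["start"]}:{link["end"]}] -> {link["definition"]}')
```

In practice the lexicon would be far larger, and the matcher would also need to handle inflections, abbreviations, and context-dependent senses, which is why term prioritization and later disambiguation matter.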

RESULTS

Physician feedback was mixed. Positive feedback on NoteAid included (1) ease of use, (2) good visual display, (3) satisfactory system speed, and (4) adequate lay definitions. Opportunities for improvement arising from the evaluation sessions and feedback included (1) improving the display of definitions for partially matched terms, (2) including more medical terms in CoDeMed, (3) improving the handling of terms whose definitions vary with context, and (4) standardizing the scope of definitions for medicines. On the basis of these results, we improved NoteAid's user interface and a number of definitions, and added 4502 more definitions to CoDeMed.

CONCLUSIONS

Physician evaluation yielded useful feedback for validating and refining the content of this innovative tool, which has the potential to improve patients' EHR comprehension and their experience using patient portals. Ongoing and future work will develop algorithms to handle ambiguous medical terms and will test and evaluate NoteAid with patients.

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/946e/5799720/305e6ba647f5/jmir_v20i1e26_fig1.jpg
