Kang Sunghoon, Park Hyewon, Taira Ricky, Kim Hyeoneui
College of Nursing, Seoul National University, 103 Daehak-ro, Jongno-gu, Seoul, 03080, Republic of Korea, +82 2-740-8483.
The Department of Radiological Sciences, David Geffen School of Medicine, University of California, Los Angeles, Los Angeles, CA, United States.
JMIR Med Inform. 2025 Jun 10;13:e71687. doi: 10.2196/71687.
BACKGROUND: As the importance of person-generated health data (PGHD) in health care and research has increased, efforts have been made to standardize survey-based PGHD to improve its usability and interoperability. Standardization efforts such as the Patient-Reported Outcomes Measurement Information System (PROMIS) and the National Institutes of Health (NIH) Common Data Elements (CDE) repository provide effective tools for managing and unifying health survey questions. However, previous methods relying on ontology-mediated annotation are labor-intensive and difficult to scale, and they struggle to identify semantic redundancies in survey questions, especially across multiple languages.

OBJECTIVE: The goal of this work was to compute the semantic similarity among publicly available health survey questions to facilitate the standardization of survey-based PGHD.

METHODS: We compiled health survey questions authored in both English and Korean from the NIH CDE repository, PROMIS, Korean public health agencies, and academic publications. Questions were drawn from various health lifelog domains. A randomized question pairing scheme was used to generate a semantic text similarity dataset of 1758 question pairs. The similarity score for each question pair was assigned by 2 human experts. The tagged dataset was then used to build 4 classifiers: bag-of-words, sentence-bidirectional encoder representations from transformers (SBERT) with bidirectional encoder representations from transformers (BERT)-based embeddings, SBERT with language-agnostic BERT sentence embedding (LaBSE), and GPT-4o. The algorithms were evaluated using traditional contingency statistics.

RESULTS: Among the algorithms evaluated, SBERT-LaBSE demonstrated the highest performance in assessing question similarity across both languages, achieving areas under the receiver operating characteristic and precision-recall curves of >0.99. SBERT-LaBSE also proved effective in identifying cross-lingual semantic similarities. The algorithm excelled at aligning semantically equivalent sentences across both languages but encountered challenges in capturing subtle nuances and in maintaining computational efficiency. Future research should focus on testing with larger multilingual datasets and on calibrating and normalizing scores across health lifelog domains to improve consistency.

CONCLUSIONS: This study introduces the SBERT-LaBSE algorithm for calculating semantic similarity across 2 languages and shows that it outperforms BERT-based models, the GPT-4o model, and the bag-of-words approach, highlighting its potential to improve the semantic interoperability of survey-based PGHD across language barriers.
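The abstract does not include implementation code; the short sketch below illustrates the kind of cross-lingual similarity scoring the methods describe, assuming the open-source sentence-transformers library and the public sentence-transformers/LaBSE checkpoint. The example questions, variable names, and 0.8 cutoff are illustrative assumptions, not values taken from the study.

# Minimal sketch of cross-lingual question-similarity scoring with SBERT + LaBSE.
# Assumptions: the sentence-transformers library, the public
# "sentence-transformers/LaBSE" checkpoint, and an illustrative 0.8 cutoff;
# none of these details are taken from the study itself.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("sentence-transformers/LaBSE")

# An English survey question paired with a roughly equivalent Korean one.
question_en = "How many hours of sleep did you get last night?"
question_ko = "어젯밤에 몇 시간 주무셨습니까?"

# LaBSE maps both languages into a shared embedding space.
embeddings = model.encode(
    [question_en, question_ko],
    convert_to_tensor=True,
    normalize_embeddings=True,
)

# Cosine similarity between the two sentence embeddings.
similarity = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"cosine similarity: {similarity:.3f}")

# A simple binary decision, as one might apply before computing ROC or
# precision-recall curves against expert-assigned similarity labels.
THRESHOLD = 0.8  # illustrative cutoff
print("semantically equivalent" if similarity >= THRESHOLD else "not equivalent")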