Shaodian Zhang, Tian Kang, Xingting Zhang, Dong Wen, Noémie Elhadad, Jianbo Lei
Department of Biomedical Informatics, Columbia University, New York, USA.
Center for Medical Informatics, Peking University, Beijing, China.
J Biomed Inform. 2016 Apr;60:334-41. doi: 10.1016/j.jbi.2016.02.011. Epub 2016 Feb 26.
Speculations express uncertainty about stated facts. In clinical texts, identifying speculations is a critical step of natural language processing (NLP). While this task is nontrivial in many languages, detecting speculations in Chinese clinical notes can be particularly challenging because word segmentation may be necessary as an upstream operation. The objective of this paper is to construct a state-of-the-art speculation detection system for Chinese clinical notes and to investigate whether embedding features and word segmentation are worth exploiting toward this task. We propose a sequence-labeling-based system for speculation detection that relies on bag-of-characters, bag-of-words, character-embedding, and word-embedding features. We experiment on a novel dataset of 36,828 clinical notes with 5,103 gold-standard speculation annotations on 2,000 notes, and compare systems whose word embeddings are computed from segmentations produced by a general-purpose segmenter and by a domain-specific segmenter, respectively. Our systems reach an F score of up to 92.2%. We demonstrate that word segmentation is critical to producing high-quality word embeddings that facilitate downstream information extraction, and suggest that a domain-specific word segmenter can be vital to such clinical NLP tasks in Chinese.
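The abstract does not specify the segmenters, embedding toolkit, or feature templates used; the following is a minimal illustrative sketch of the general idea, contrasting character-level and word-level views of Chinese text and deriving per-character features for a BIO-style sequence labeler. Here jieba stands in for a general-purpose segmenter, gensim Word2Vec stands in for the embedding step, and the sentences and feature template are made-up examples, not the authors' pipeline.

```python
# Illustrative sketch only: jieba, gensim, the toy sentences, and the feature
# template are assumptions, not the paper's actual components.
import jieba
from gensim.models import Word2Vec

# Toy "corpus" of Chinese clinical-style sentences (hypothetical examples).
corpus = [
    "患者可能存在肺部感染",   # "the patient may have a pulmonary infection"
    "不排除心肌梗死的可能",   # "myocardial infarction cannot be ruled out"
]

# Character-level view: every Chinese character is a token.
char_sentences = [list(sent) for sent in corpus]

# Word-level view: an upstream segmenter groups characters into words.
word_sentences = [list(jieba.cut(sent)) for sent in corpus]

# Train small embedding models on each view (character vs. word embeddings).
char_emb = Word2Vec(sentences=char_sentences, vector_size=50, window=2, min_count=1)
word_emb = Word2Vec(sentences=word_sentences, vector_size=50, window=2, min_count=1)

def char_features(sent, i):
    """Hypothetical per-character feature dict for a BIO sequence labeler."""
    return {
        "char": sent[i],
        "prev_char": sent[i - 1] if i > 0 else "<BOS>",
        "next_char": sent[i + 1] if i < len(sent) - 1 else "<EOS>",
    }

# One feature dict per character, ready to pair with B/I/O speculation tags
# and feed to a CRF-style tagger (e.g. sklearn-crfsuite).
features = [char_features(char_sentences[0], i) for i in range(len(char_sentences[0]))]
print(word_sentences[0])
print(features[:3])
```

In a setup like this, the quality of the word-level segmentation directly determines the vocabulary over which word embeddings are trained, which is the mechanism by which a domain-specific segmenter could improve downstream speculation detection.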