

Identification and Impact Analysis of Family History of Psychiatric Disorder in Mood Disorder Patients With Pretrained Language Model.

Authors

Wan Cheng, Ge Xuewen, Wang Junjie, Zhang Xin, Yu Yun, Hu Jie, Liu Yun, Ma Hui

Affiliations

Department of Medical Informatics, School of Biomedical Engineering and Informatics, Nanjing Medical University, Nanjing, China.

Institute of Medical Informatics and Management, Nanjing Medical University, Nanjing, China.

Publication

Front Psychiatry. 2022 May 20;13:861930. doi: 10.3389/fpsyt.2022.861930. eCollection 2022.

DOI: 10.3389/fpsyt.2022.861930
PMID: 35669265
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9163373/
Abstract

Mood disorders are ubiquitous mental disorders with familial aggregation. Extracting family history of psychiatric disorders from large electronic hospitalization records is helpful for further study of onset characteristics among patients with a mood disorder. This study uses an observational clinical data set of in-patients of Nanjing Brain Hospital, affiliated with Nanjing Medical University, from the past 10 years. This paper proposes a pretrained language model: Bidirectional Encoder Representations from Transformers (BERT)-Convolutional Neural Network (CNN). We first project the electronic hospitalization records into a low-dimensional dense matrix using the pretrained Chinese BERT model, then feed the dense matrix into the stacked CNN layer to capture high-level features of the texts; finally, we use a fully connected layer to extract family history from these high-level features. The accuracy of our BERT-CNN model was 97.12 ± 0.37% on the real-world data set from Nanjing Brain Hospital. We further studied the correlation between mood disorders and family history of psychiatric disorder.
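The pipeline the abstract describes (BERT encoder → dense token matrix → CNN over tokens → fully connected classifier) can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: random vectors stand in for real Chinese BERT embeddings, a single convolutional layer stands in for the stacked CNN, and all layer sizes are assumed for illustration only.

```python
import numpy as np

# Minimal sketch of a BERT-CNN classification head.
# Random vectors replace real BERT output; hyperparameters are illustrative.
rng = np.random.default_rng(0)

seq_len, hidden = 128, 768    # tokens per record, BERT hidden size
n_filters, kernel = 64, 3     # one convolutional layer shown
n_classes = 2                 # family history: present / absent

# 1) "Low-dimensional dense matrix": stand-in for the pretrained
#    Chinese BERT encoding of one hospitalization record.
embeddings = rng.standard_normal((seq_len, hidden))

# 2) 1-D convolution over the token axis with ReLU, capturing
#    local high-level features of the text.
W_conv = rng.standard_normal((n_filters, kernel, hidden)) * 0.01
feature_maps = np.stack([
    np.maximum(0.0, np.array([
        np.sum(embeddings[i:i + kernel] * W_conv[f])
        for i in range(seq_len - kernel + 1)
    ]))
    for f in range(n_filters)
])  # shape: (n_filters, seq_len - kernel + 1)

# 3) Max-over-time pooling, then a fully connected layer
#    producing class probabilities via softmax.
pooled = feature_maps.max(axis=1)             # (n_filters,)
W_fc = rng.standard_normal((n_classes, n_filters)) * 0.01
logits = W_fc @ pooled
probs = np.exp(logits) / np.exp(logits).sum()
```

In practice the embedding step would call a pretrained Chinese BERT (e.g. via the `transformers` library) and the convolution/classifier would be trained layers in a deep learning framework; the shapes and data flow above are the point of the sketch.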


Figures (PMC9163373):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2d8a/9163373/3a458a1e8034/fpsyt-13-861930-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2d8a/9163373/f4fbf621dcb9/fpsyt-13-861930-g0003.jpg

Similar articles

1. Identification and Impact Analysis of Family History of Psychiatric Disorder in Mood Disorder Patients With Pretrained Language Model.
Front Psychiatry. 2022 May 20;13:861930. doi: 10.3389/fpsyt.2022.861930. eCollection 2022.
2. Extracting clinical named entity for pituitary adenomas from Chinese electronic medical records.
BMC Med Inform Decis Mak. 2022 Mar 23;22(1):72. doi: 10.1186/s12911-022-01810-z.
3. Relation Classification for Bleeding Events From Electronic Health Records Using Deep Learning Systems: An Empirical Study.
JMIR Med Inform. 2021 Jul 2;9(7):e27527. doi: 10.2196/27527.
4. Multi-Label Classification in Patient-Doctor Dialogues With the RoBERTa-WWM-ext + CNN (Robustly Optimized Bidirectional Encoder Representations From Transformers Pretraining Approach With Whole Word Masking Extended Combining a Convolutional Neural Network) Model: Named Entity Study.
JMIR Med Inform. 2022 Apr 21;10(4):e35606. doi: 10.2196/35606.
5. Semantic Textual Similarity in Japanese Clinical Domain Texts Using BERT.
Methods Inf Med. 2021 Jun;60(S 01):e56-e64. doi: 10.1055/s-0041-1731390. Epub 2021 Jul 8.
6. Developing Artificial Intelligence Models for Extracting Oncologic Outcomes from Japanese Electronic Health Records.
Adv Ther. 2023 Mar;40(3):934-950. doi: 10.1007/s12325-022-02397-7. Epub 2022 Dec 22.
7. BERT-Kgly: A Bidirectional Encoder Representations From Transformers (BERT)-Based Model for Predicting Lysine Glycation Site for .
Front Bioinform. 2022 Feb 18;2:834153. doi: 10.3389/fbinf.2022.834153. eCollection 2022.
8. Extracting comprehensive clinical information for breast cancer using deep learning methods.
Int J Med Inform. 2019 Dec;132:103985. doi: 10.1016/j.ijmedinf.2019.103985. Epub 2019 Oct 2.
9. An Evaluation of Pretrained BERT Models for Comparing Semantic Similarity Across Unstructured Clinical Trial Texts.
Stud Health Technol Inform. 2022 Jan 14;289:18-21. doi: 10.3233/SHTI210848.
10. When BERT meets Bilbo: a learning curve analysis of pretrained language model on disease classification.
BMC Med Inform Decis Mak. 2022 Apr 5;21(Suppl 9):377. doi: 10.1186/s12911-022-01829-2.

Cited by

1. The Applications of Large Language Models in Mental Health: Scoping Review.
J Med Internet Res. 2025 May 5;27:e69284. doi: 10.2196/69284.
2. Depression among general outpatient department attendees in selected hospitals in Somalia: magnitude and associated factors.
BMC Psychiatry. 2024 Aug 27;24(1):579. doi: 10.1186/s12888-024-06020-7.

References

1. Relation Classification for Bleeding Events From Electronic Health Records Using Deep Learning Systems: An Empirical Study.
JMIR Med Inform. 2021 Jul 2;9(7):e27527. doi: 10.2196/27527.
2. Novel Risk Loci Associated With Genetic Risk for Bipolar Disorder Among Han Chinese Individuals: A Genome-Wide Association Study and Meta-analysis.
JAMA Psychiatry. 2021 Mar 1;78(3):320-330. doi: 10.1001/jamapsychiatry.2020.3738.
3. Chinese clinical named entity recognition with variant neural structures based on BERT methods.
J Biomed Inform. 2020 Jul;107:103422. doi: 10.1016/j.jbi.2020.103422. Epub 2020 Apr 28.
4. Temporal information extraction from mental health records to identify duration of untreated psychosis.
J Biomed Semantics. 2020 Mar 10;11(1):2. doi: 10.1186/s13326-020-00220-2.
5. Evaluating sentence representations for biomedical text: Methods and experimental results.
J Biomed Inform. 2020 Apr;104:103396. doi: 10.1016/j.jbi.2020.103396. Epub 2020 Mar 6.
6. Multiple features for clinical relation extraction: A machine learning approach.
J Biomed Inform. 2020 Mar;103:103382. doi: 10.1016/j.jbi.2020.103382. Epub 2020 Feb 3.
7. An electronic family health history tool to identify and manage patients at increased risk for colorectal cancer: protocol for a randomized controlled trial.
Trials. 2019 Oct 7;20(1):576. doi: 10.1186/s13063-019-3659-y.
8. A frame semantic overview of NLP-based information extraction for cancer-related EHR notes.
J Biomed Inform. 2019 Dec;100:103301. doi: 10.1016/j.jbi.2019.103301. Epub 2019 Oct 4.
9. Fine-Tuning Bidirectional Encoder Representations From Transformers (BERT)-Based Models on Large-Scale Electronic Health Record Notes: An Empirical Study.
JMIR Med Inform. 2019 Sep 12;7(3):e14830. doi: 10.2196/14830.
10. BioBERT: a pre-trained biomedical language representation model for biomedical text mining.
Bioinformatics. 2020 Feb 15;36(4):1234-1240. doi: 10.1093/bioinformatics/btz682.