


DPRM: DeBERTa-based potential relationship multi-headed self-attention joint extraction model.

Authors

Li Songjiang, Cao Jinming, Yang Jiao, He Yunjiangcan, Wang Peng

Affiliations

College of Computer Science and Technology, Changchun University of Science and Technology, Changchun, China.

Chongqing Research Institute, Changchun University of Science and Technology, Chongqing, China.

Publication

PLoS One. 2025 Aug 5;20(8):e0329120. doi: 10.1371/journal.pone.0329120. eCollection 2025.

DOI: 10.1371/journal.pone.0329120
PMID: 40763147
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12324142/
Abstract

Traditional entity-relationship joint extraction models are typically designed for generic-domain data, which limits their effectiveness in domain-specific applications such as manufacturing. This study presents the DeBERTa-based Potential Relationship Multi-Headed Self-Attention Joint Extraction Model (DPRM), designed specifically to improve the accuracy and efficiency of entity-relationship extraction in manufacturing knowledge graphs. The model comprises three core components: a semantic representation module, a relationship extraction and entity recognition module, and a global entity pairing module. In the semantic representation module, a DeBERTa encoder is trained on the input sentences to generate word embeddings; word dependencies are then captured with Bi-GRU and Multi-Headed Self-Attention mechanisms, which enhance the overall sentence representation. The relationship extraction and entity recognition module identifies potential relationships within the sentences and integrates a relational gated mechanism to minimize interference from irrelevant information during entity recognition. The global entity pairing module simplifies the model's architecture by extracting potential relationships and constructing a matrix of globally paired entity pairs from fault-specific data. The proposed model is validated through experiments on fault datasets. The results show that DPRM achieves superior performance, with an F1 score surpassing that of existing models, highlighting its effectiveness in the fault domain.
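The semantic representation module described in the abstract can be sketched as a small PyTorch pipeline: contextual word embeddings feed a Bi-GRU, whose output is re-weighted by multi-head self-attention. This is only an illustrative reconstruction — a learned `nn.Embedding` stands in for the DeBERTa encoder, and all dimensions, layer sizes, and the residual connection are assumptions, not values from the paper.

```python
# Sketch of the semantic representation module: embeddings -> Bi-GRU ->
# multi-head self-attention. nn.Embedding is a stand-in for DeBERTa;
# every dimension here is an illustrative assumption.
import torch
import torch.nn as nn

class SemanticRepresentation(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=128, hidden=64, heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)  # placeholder for DeBERTa
        # Bi-GRU captures sequential word dependencies in both directions.
        self.bigru = nn.GRU(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Multi-head self-attention relates every token to every other token.
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)

    def forward(self, token_ids):
        x = self.embed(token_ids)      # (batch, seq, emb_dim)
        h, _ = self.bigru(x)           # (batch, seq, 2 * hidden)
        out, _ = self.attn(h, h, h)    # self-attention: Q = K = V = h
        return out + h                 # residual keeps the GRU features

tokens = torch.randint(0, 1000, (2, 16))   # 2 sentences, 16 tokens each
reps = SemanticRepresentation()(tokens)
print(reps.shape)                          # torch.Size([2, 16, 128])
```

In a faithful implementation the embedding layer would be replaced by a pretrained DeBERTa encoder (e.g. via the `transformers` library), with the Bi-GRU and attention layers trained on top of its hidden states.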

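The other two components the abstract names — the relational gated mechanism and the global entity-pairing matrix — can likewise be sketched: a sigmoid gate conditioned on a predicted relation suppresses relation-irrelevant token features, and a token-by-token score matrix pairs candidate subjects with objects. The gating form, projections, and shapes below are assumptions for illustration, not the paper's exact design.

```python
# Sketch of relation-gated entity features plus a global pairing matrix.
# The gate g filters token features h by a relation embedding r; the
# (seq x seq) score matrix ranks every (subject, object) token pair.
import torch
import torch.nn as nn

class RelationGatedPairing(nn.Module):
    def __init__(self, dim=128, num_relations=8):
        super().__init__()
        self.rel_embed = nn.Embedding(num_relations, dim)
        self.gate = nn.Linear(2 * dim, dim)   # gate from [token; relation]
        self.subj = nn.Linear(dim, dim)       # subject-side projection
        self.obj = nn.Linear(dim, dim)        # object-side projection

    def forward(self, h, rel_ids):
        # h: (batch, seq, dim); rel_ids: one potential relation per sentence
        r = self.rel_embed(rel_ids).unsqueeze(1).expand_as(h)
        g = torch.sigmoid(self.gate(torch.cat([h, r], dim=-1)))
        h = g * h                             # relation-gated token features
        # Global pairing: score every (subject token, object token) pair.
        scores = self.subj(h) @ self.obj(h).transpose(1, 2)
        return scores                         # (batch, seq, seq)

h = torch.randn(2, 16, 128)                   # e.g. output of the encoder
scores = RelationGatedPairing()(h, torch.tensor([0, 3]))
print(scores.shape)                           # torch.Size([2, 16, 16])
```

In practice the relation ids would come from the relationship-extraction step rather than being supplied by hand, and entity spans would be decoded from the highest-scoring cells of the pairing matrix.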

Figures (pone.0329120.g001–g014):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/0bb658e85617/pone.0329120.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/e2565908214c/pone.0329120.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/405f6ee88c2a/pone.0329120.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/318adb51f2be/pone.0329120.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/ee3fc755782b/pone.0329120.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/95172377ae8a/pone.0329120.g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/6712e1cedc64/pone.0329120.g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/5483735e484e/pone.0329120.g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/5be23093e60b/pone.0329120.g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/1e2d7c5b005f/pone.0329120.g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/ca791e9689fa/pone.0329120.g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/5ecae31f5c26/pone.0329120.g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/7e096948bf96/pone.0329120.g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b471/12324142/d74d63d69412/pone.0329120.g014.jpg

Similar Articles

1
DPRM: DeBERTa-based potential relationship multi-headed self-attention joint extraction model.
PLoS One. 2025 Aug 5;20(8):e0329120. doi: 10.1371/journal.pone.0329120. eCollection 2025.
2
Short-Term Memory Impairment
3
BAMRE: Joint extraction model of Chinese medical entities and relations based on Biaffine transformation with relation attention.
J Biomed Inform. 2024 Oct;158:104733. doi: 10.1016/j.jbi.2024.104733. Epub 2024 Oct 3.
4
Knowledge Graph-Enhanced Deep Learning Model (H-SYSTEM) for Hypertensive Intracerebral Hemorrhage: Model Development and Validation.
J Med Internet Res. 2025 Jun 12;27:e66055. doi: 10.2196/66055.
5
Graph neural networks embedded with domain knowledge for cyber threat intelligence entity and relationship mining.
PeerJ Comput Sci. 2025 Apr 4;11:e2769. doi: 10.7717/peerj-cs.2769. eCollection 2025.
6
Management of urinary stones by experts in stone disease (ESD 2025).
Arch Ital Urol Androl. 2025 Jun 30;97(2):14085. doi: 10.4081/aiua.2025.14085.
7
Comparison of Two Modern Survival Prediction Tools, SORG-MLA and METSSS, in Patients With Symptomatic Long-bone Metastases Who Underwent Local Treatment With Surgery Followed by Radiotherapy and With Radiotherapy Alone.
Clin Orthop Relat Res. 2024 Dec 1;482(12):2193-2208. doi: 10.1097/CORR.0000000000003185. Epub 2024 Jul 23.
8
Semantic classification of Indonesian consumer health questions.
J Biomed Semantics. 2025 Jul 28;16(1):13. doi: 10.1186/s13326-025-00334-5.
9
ParTRE: A relational triple extraction model of complicated entities and imbalanced relations in Parkinson's disease.
J Biomed Inform. 2024 Apr;152:104624. doi: 10.1016/j.jbi.2024.104624. Epub 2024 Mar 11.
10
HEART: Learning better representation of EHR data with a heterogeneous relation-aware transformer.
J Biomed Inform. 2024 Nov;159:104741. doi: 10.1016/j.jbi.2024.104741. Epub 2024 Oct 29.
