
Construction and Application of Text Entity Relation Joint Extraction Model Based on Multi-Head Attention Neural Network.

Affiliations

School of Computer and Information, Hohai University, Nanjing, Jiangsu 211100, China.

Department of Information Science and Technology, Nanjing Normal University Zhongbei College, Nanjing, Jiangsu 210046, China.

Publication Information

Comput Intell Neurosci. 2022 May 24;2022:1530295. doi: 10.1155/2022/1530295. eCollection 2022.

Abstract

Entity relation extraction is a key area of information extraction and an important research topic in natural language processing. Building on previous work, this paper proposes a joint extraction model based on a multi-head attention neural network. The model performs the text entity and relation extraction tasks on top of the BERT pretrained architecture, while integrating named-entity features, term-labeling features, and relation-training features. A multi-head attention mechanism and improved neural structures are added to the model to strengthen its feature-extraction capacity. A study of the multi-head attention parameters shows that the optimal settings are 8 and 16, at which the model's classification performance is best. In the experimental analysis, the traditional text entity relation extraction model is compared with the multi-head attention joint extraction model, and entity relation extraction is evaluated in terms of the comprehensive evaluation index F1, accuracy, and system time consumed. The experiments show the following. First, on the accuracy metric, Xception performs best, reaching 87.7%, indicating that the model's feature extraction is enhanced. Second, as the number of iterations increases, the validation-set and training-set curves rise to 96% and 98%, respectively, so the model generalizes well. Third, the model extracts all data in the test set within 1005 ms, which is an acceptable speed. The test results are therefore good, and the model has strong practical value.
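The multi-head attention mechanism at the core of the abstract can be sketched as follows. This is a minimal NumPy illustration rather than the paper's implementation: the projection weights are random stand-ins for learned BERT parameters, and the dimensions (sequence length 5, model width 64) are illustrative assumptions, not values reported in the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads=8, rng=None):
    """Self-attention over a sequence x of shape (seq_len, d_model).

    The weight matrices here are random placeholders; in the actual
    model they would be learned during training.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0, "d_model must divide evenly across heads"
    d_head = d_model // num_heads
    rng = rng or np.random.default_rng(0)
    # Hypothetical query/key/value/output projections.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    # Project, then split the model dimension into (num_heads, d_head).
    q = (x @ Wq).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    k = (x @ Wk).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    v = (x @ Wv).reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention, computed per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_head)   # (heads, seq, seq)
    attn = softmax(scores, axis=-1)                       # rows sum to 1
    # Concatenate the heads back into d_model, then project out.
    out = (attn @ v).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo, attn

# Illustrative run: 5 tokens, 64-dim embeddings, 8 heads.
rng = np.random.default_rng(1)
x = rng.standard_normal((5, 64))
out, attn = multi_head_attention(x, num_heads=8, rng=rng)
```

Splitting the 64-dimensional representation into 8 heads of 8 dimensions each lets every head attend to a different subspace of the input, which is the property the paper exploits to enrich feature extraction.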


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/884c/9155959/009e6fa1220f/CIN2022-1530295.001.jpg
