School of Computer and Information, Hohai University, Nanjing, Jiangsu 211100, China.
Department of Information Science and Technology, Nanjing Normal University Zhongbei College, Nanjing, Jiangsu 210046, China.
Comput Intell Neurosci. 2022 May 24;2022:1530295. doi: 10.1155/2022/1530295. eCollection 2022.
Entity-relation extraction is one of the key areas of information extraction and an important research topic in natural language processing. Building on prior research, this paper proposes a joint extraction model based on a multi-head attention neural network. On top of the BERT pretrained model architecture, the model jointly extracts textual entities and relations, integrating named entity features and term-labeling features into the joint training of relation extraction. A multi-head attention mechanism and improved neural structures are added to the model to enhance its feature extraction capacity. A study of the multi-head attention hyperparameters shows that the optimal setting is h = 8 attention heads with a per-head dimension d = 16, at which the model's classification performance is best. In the experimental analysis, the traditional text entity-relation extraction model is compared with the multi-head attention joint extraction model, and extraction performance is evaluated in terms of the comprehensive evaluation index F1, accuracy, and system time consumed. The experiments show the following. First, on the accuracy metric, Xception performs best, reaching 87.7%, indicating that the model's feature extraction is enhanced. Second, as the number of iterations increases, the validation-set and training-set curves rise to 96% and 98%, respectively, showing that the model generalizes well. Third, the model completes extraction over the entire test set in 1005 ms, an acceptable speed. The test results of the model in this article are therefore good, with strong practical value.
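For concreteness, the following is a minimal sketch, not the authors' implementation, of a multi-head self-attention block matching the hyperparameters reported in the abstract. It is written in Python with PyTorch and assumes a BERT hidden size of 768; it interprets the tuned parameters as h = 8 heads with a per-head dimension of 16, and all class and variable names are hypothetical.

import torch
import torch.nn as nn

class MultiHeadSelfAttention(nn.Module):
    # Assumed configuration: h = 8 heads, per-head dimension 16 (inner size 128),
    # projecting from and back to the standard BERT-base hidden size of 768.
    def __init__(self, in_dim=768, num_heads=8, head_dim=16):
        super().__init__()
        self.num_heads, self.head_dim = num_heads, head_dim
        inner = num_heads * head_dim                  # 8 * 16 = 128
        self.qkv = nn.Linear(in_dim, 3 * inner)       # joint Q/K/V projection
        self.out = nn.Linear(inner, in_dim)           # project back to BERT size

    def forward(self, x):                             # x: (batch, seq_len, in_dim)
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # reshape each of Q, K, V to (batch, heads, seq_len, head_dim)
        shape = (b, n, self.num_heads, self.head_dim)
        q, k, v = (t.view(*shape).transpose(1, 2) for t in (q, k, v))
        # scaled dot-product attention, computed independently per head
        att = (q @ k.transpose(-2, -1)) / self.head_dim ** 0.5
        att = att.softmax(dim=-1)
        # concatenate the head outputs and project back to the input size
        out = (att @ v).transpose(1, 2).reshape(b, n, -1)
        return self.out(out)

# Toy usage: in the joint extraction model, `x` would be the contextual
# token embeddings produced by the BERT encoder.
x = torch.randn(2, 32, 768)
print(MultiHeadSelfAttention()(x).shape)  # torch.Size([2, 32, 768])

The sketch only illustrates the attention mechanism itself; the entity-tagging and relation-classification heads that the paper layers on top of it are omitted.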