Semantic-guided attention and adaptive gating for document-level relation extraction.

Affiliations

Department of Intelligent Culture and Tourism, The Open University of Henan, Zhengzhou, 450046, China.

Department of Information Engineering, The Open University of Henan, Zhengzhou, 450046, China.

Publication information

Sci Rep. 2024 Nov 4;14(1):26628. doi: 10.1038/s41598-024-78051-9.

Abstract

In natural language processing, document-level relation extraction is a complex task that aims to predict the relationships among entities by capturing contextual interactions across an unstructured document. Existing graph- and transformer-based models capture long-range relational facts across sentences. However, they still cannot fully exploit the semantic information from multiple interacting sentences, so sentences that are influential for the related entities may be excluded. To address this problem, a novel Semantic-guided Attention and Adaptively Gated (SAAG) model is developed for document-level relation extraction. First, a semantic-guided attention module is designed to guide the sentence representation by assigning different attention scores to different words. A multihead attention mechanism is then used to further capture attention across different subspaces and generate a document-level context representation. Finally, the SAAG model exploits the semantic information through a gating mechanism that dynamically distinguishes between local and global contexts. Experimental results demonstrate that the SAAG model outperforms previous models on two public datasets.
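
The abstract names the adaptive gating mechanism but gives no equations, so the following is a minimal PyTorch sketch of one plausible reading: a sigmoid gate, computed from the concatenated local (sentence-level) and global (document-level) contexts, that interpolates between the two. The class name AdaptiveGate, the single linear projection, and the hidden size are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of an adaptive gate that fuses a local (sentence-level)
# context with a global (document-level) context, as described in the
# abstract. Layer names and sizes are assumptions for illustration.
import torch
import torch.nn as nn


class AdaptiveGate(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # The gate is computed from the concatenation of both contexts.
        self.gate_proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, local_ctx: torch.Tensor, global_ctx: torch.Tensor) -> torch.Tensor:
        # local_ctx, global_ctx: (batch, hidden_size)
        g = torch.sigmoid(self.gate_proj(torch.cat([local_ctx, global_ctx], dim=-1)))
        # Convex combination: g near 1 favours the local sentence context,
        # g near 0 favours the global document context.
        return g * local_ctx + (1.0 - g) * global_ctx


if __name__ == "__main__":
    gate = AdaptiveGate(hidden_size=768)
    local = torch.randn(4, 768)    # e.g. sentence-level entity-pair context
    global_ = torch.randn(4, 768)  # e.g. document-level multihead-attention context
    fused = gate(local, global_)
    print(fused.shape)             # torch.Size([4, 768])
```

In this reading, the gate lets each entity pair decide dynamically how much weight to place on the sentence that mentions it versus the document as a whole, which matches the abstract's claim of distinguishing local from global contexts.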

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ce45/11535381/d2a99ff6adba/41598_2024_78051_Fig1_HTML.jpg
