Institute of Automation, Chinese Academy of Sciences (CAS), China; University of Chinese Academy of Sciences (UCAS), China.
Institute of Automation, Chinese Academy of Sciences (CAS), China.
Neural Netw. 2018 Dec;108:240-247. doi: 10.1016/j.neunet.2018.08.016. Epub 2018 Aug 29.
Distantly supervised relation extraction is an important task in the field of natural language processing. Most state-of-the-art methods suffer from two main shortcomings. One is that they take all sentences of an entity pair as input, which incurs a large computational cost; in fact, a few of the most relevant sentences are sufficient to recognize the relation of an entity pair. To tackle these problems, we propose a novel hierarchical selective attention network for relation extraction under distant supervision. Our model first selects the most relevant sentences by applying coarse sentence-level attention over all sentences of an entity pair, then employs word-level attention to construct sentence representations, and finally applies fine sentence-level attention to aggregate these sentence representations. Experimental results on a widely used dataset demonstrate that our method performs significantly better than most existing methods.
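The three-stage pipeline described in the abstract (coarse sentence selection, word-level attention, fine sentence-level aggregation) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes simple dot-product attention against a single relation query vector, uses the mean word embedding as the rough sentence vector for the coarse stage, and fixes the number of selected sentences to a hypothetical `k`.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def hierarchical_selective_attention(bag, query, k=2):
    """bag: list of (num_words, d) word-embedding matrices, one per sentence
    of an entity pair. query: hypothetical (d,) relation query vector."""
    # 1) Coarse sentence-level attention: score each sentence by its mean
    #    word embedding against the query; keep only the top-k sentences.
    rough = np.stack([s.mean(axis=0) for s in bag])   # (n_sent, d)
    coarse_scores = rough @ query                     # (n_sent,)
    top = np.argsort(-coarse_scores)[:k]
    # 2) Word-level attention: build a representation for each kept sentence
    #    as an attention-weighted sum of its word embeddings.
    reprs = []
    for i in top:
        alpha = softmax(bag[i] @ query)               # (num_words,)
        reprs.append(alpha @ bag[i])                  # (d,)
    reprs = np.stack(reprs)                           # (k, d)
    # 3) Fine sentence-level attention: aggregate the k sentence
    #    representations into one vector for the entity pair.
    beta = softmax(reprs @ query)                     # (k,)
    return beta @ reprs                               # (d,)

# Toy bag of 6 sentences with random "word embeddings" of dimension 8.
d = 8
bag = [rng.normal(size=(int(rng.integers(5, 12)), d)) for _ in range(6)]
query = rng.normal(size=d)
vec = hierarchical_selective_attention(bag, query, k=2)
print(vec.shape)  # (8,)
```

In a trained model the coarse scores would come from learned parameters rather than raw mean embeddings, but the key point survives in the sketch: only `k` of the sentences ever reach the word-level and fine attention stages, which is where the computational saving claimed in the abstract comes from.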