Information System and Security & Countermeasures Experimental Center, Beijing Institute of Technology, Beijing 100081, China.
Neural Netw. 2021 Oct;142:213-220. doi: 10.1016/j.neunet.2021.04.032. Epub 2021 Apr 28.
Distant supervision relation extraction methods are widely used to extract relational facts from text. The traditional selective attention model treats instances in a bag as independent of one another, so it makes insufficient use of both the correlation information between instances and the supervision signal from all correctly labeled instances, which degrades the performance of the relation extractor. To address this problem, a distant supervision relation extraction method with self-selective attention is proposed. The method uses one convolution layer together with a self-attention mechanism to encode instances, learning better semantic vector representations of them. The correlation between instances in a bag is then used to assign higher weights to all correctly labeled instances, and the weighted sum of the instances in the bag yields the bag-level vector representation. Experiments on the NYT dataset show that the method makes full use of the information from all correctly labeled instances in a bag and achieves better results than the baselines.
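The bag-aggregation step described above can be sketched as follows. This is a minimal illustrative interpretation, not the paper's exact formulation: it assumes each instance is already encoded as a vector, and scores each instance by its average similarity to the other instances in the bag, so that instances agreeing with the bag majority (the presumably correctly labeled ones) receive higher weights in the weighted sum.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def bag_vector(instances):
    """Aggregate a bag of encoded instances into one bag vector.

    instances: array of shape (n_instances, dim), each row an
    instance representation produced by the encoder (here assumed
    given; the paper uses a convolution layer plus self-attention).
    """
    sim = instances @ instances.T        # pairwise dot-product similarity
    np.fill_diagonal(sim, 0.0)           # ignore self-similarity
    scores = sim.mean(axis=1)            # agreement with the rest of the bag
    weights = softmax(scores)            # attention weights over instances
    return weights @ instances           # weighted sum -> bag representation

# Usage: a toy bag of 4 instance vectors of dimension 8.
rng = np.random.default_rng(0)
bag = rng.normal(size=(4, 8))
vec = bag_vector(bag)
```

The resulting bag vector would then be fed to a relation classifier; the exact scoring function and normalization in the paper may differ from this dot-product sketch.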