Feng Ao, Liu Tao, Li Xiaojie, Jia Ke, Gao Zhengjie
School of Computer Science, Chengdu University of Information Technology, Chengdu, 610225, China.
School of Electronic Information Engineering, Geely University of China, Chengdu, 641423, China.
Sci Rep. 2024 Oct 9;14(1):23528. doi: 10.1038/s41598-024-74668-y.
Aspect-based sentiment analysis (ABSA) is a challenging task because a single sentence can contain multiple aspect words with different sentiment polarities. Recently, pre-trained language models such as BERT have been widely used as context encoders in ABSA. Graph neural networks have also been employed to extract syntactic and semantic information from sentence parse trees, yielding superior results. However, dependency trees may introduce irrelevant dependencies for sentences with irregular syntax and complex structures. Additionally, previous methods have not fully exploited recent developments in pre-trained language models. We therefore propose a Dual Syntax aware Graph attention networks with Prompt (DSGP) model to address these issues. Our model uses prompt templates to better elicit the knowledge of pre-trained models and takes the masked vector outputs of the templates as supplementary aspect feature representations. We also apply graph attention networks to both dependency trees and constituent trees to extract complementary types of syntactic information: the dependency tree captures syntactic relations between words, while the constituent tree provides a high-level view of the sentence's phrase structure. Finally, the outputs from the prompt and the two parse trees are fused and fed into a standard classifier. Experimental results on four public datasets demonstrate the competitive performance of our model.
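The pipeline described above can be sketched minimally: a graph attention step restricted to parse-tree edges is run twice (once on a dependency adjacency matrix, once on a constituent one), and the aspect-position outputs are fused with a prompt-derived mask vector before a softmax classifier. This is an illustrative NumPy sketch, not the authors' implementation; all shapes, the toy adjacency matrices, and the fusion-by-concatenation choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(h, adj, w, a):
    """One attention layer that only attends along parse-tree edges.

    h: (n, d) token features; adj: (n, n) binary adjacency from a parse tree.
    """
    z = h @ w                                    # project tokens: (n, d2)
    n, d2 = z.shape
    # Pairwise [z_i ; z_j] features for every (i, j) token pair.
    pair = np.concatenate(
        [np.repeat(z, n, axis=0), np.tile(z, (n, 1))], axis=1
    ).reshape(n, n, 2 * d2)
    scores = np.tanh(pair @ a)                   # (n, n) edge scores
    scores = np.where(adj > 0, scores, -1e9)     # mask pairs with no edge
    alpha = softmax(scores, axis=1)              # per-node attention weights
    return alpha @ z                             # aggregate neighbor features

n, d, d2, n_classes = 5, 8, 8, 3
h = rng.normal(size=(n, d))                      # stand-in for BERT token states
dep_adj = np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)  # toy dependency graph
con_adj = np.ones((n, n))                        # toy constituent graph (fully linked)

h_dep = graph_attention(h, dep_adj, rng.normal(size=(d, d2)), rng.normal(size=2 * d2))
h_con = graph_attention(h, con_adj, rng.normal(size=(d, d2)), rng.normal(size=2 * d2))

mask_vec = rng.normal(size=d2)                   # stand-in for the prompt [MASK] output
aspect = 2                                       # index of the aspect token (assumed)
fused = np.concatenate([h_dep[aspect], h_con[aspect], mask_vec])

logits = fused @ rng.normal(size=(3 * d2, n_classes))
probs = softmax(logits)                          # sentiment polarity distribution
```

In this sketch the two graph-attention passes share no weights, matching the idea that the dependency and constituent views capture different syntactic signals; the random weights stand in for learned parameters.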