

ACR-SA: attention-based deep model through two-channel CNN and Bi-RNN for sentiment analysis.

Author information

Kamyab Marjan, Liu Guohua, Rasool Abdur, Adjeisah Michael

Affiliations

School of Computer Science and Technology, Donghua University, Shanghai, China.

Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Beijing, China.

Publication information

PeerJ Comput Sci. 2022 Mar 17;8:e877. doi: 10.7717/peerj-cs.877. eCollection 2022.

Abstract

Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN) have been successfully applied to Natural Language Processing (NLP), especially in sentiment analysis. NLP can execute numerous functions to achieve significant results through RNN and CNN. Likewise, previous research shows that RNN achieved more meaningful results than CNN due to its ability to extract long-term dependencies. Meanwhile, CNN has its own advantage: it can extract high-level features using its local fixed-size context at the input level. However, integrating these advantages into one network is challenging because of overfitting during training. Another problem with such models is that they consider all features equally. To this end, we propose an attention-based sentiment analysis model using CNN and two independent bidirectional RNN networks to address the problems mentioned above and improve sentiment knowledge. Firstly, we apply a preprocessor to enhance data quality by correcting spelling mistakes and removing noisy content. Secondly, our model utilizes CNN with max-pooling to extract contextual features and reduce feature dimensionality. Thirdly, two independent bidirectional RNNs, i.e., Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), are used to capture long-term dependencies. We also apply an attention mechanism to the RNN layer output to emphasize each word's attention level. Furthermore, Gaussian Noise and Dropout are applied as regularization to avoid the overfitting problem. Finally, we verify the model's robustness on four standard datasets. Compared with existing improvements on the most recent neural network models, the experimental results show that our model significantly outperforms the state-of-the-art models.
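The attention step the abstract describes (weighting each word's RNN output before classification) can be illustrated with a minimal pure-Python sketch. This is not the authors' implementation: the function names `softmax` and `attention_pool`, and the use of a single score vector standing in for the learned attention parameters, are simplifying assumptions for illustration only.

```python
import math

def softmax(scores):
    # Numerically stable softmax: turns raw per-word scores into
    # attention weights that are positive and sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention_pool(hidden_states, score_vector):
    # hidden_states: T hidden vectors from the Bi-RNN, one per word.
    # score_vector: stand-in for the learned attention parameters;
    # each word is scored by a dot product with its hidden state.
    scores = [sum(w * h for w, h in zip(score_vector, ht))
              for ht in hidden_states]
    alphas = softmax(scores)  # attention weight ("attention level") per word
    dim = len(hidden_states[0])
    # Context vector: attention-weighted sum of the hidden states,
    # emphasizing the words with the highest attention weights.
    context = [sum(a * ht[i] for a, ht in zip(alphas, hidden_states))
               for i in range(dim)]
    return context, alphas
```

For example, with two hidden states `[1.0, 0.0]` and `[0.0, 1.0]` and score vector `[1.0, 0.0]`, the first word scores higher, so it receives the larger attention weight and dominates the context vector; sentiment-bearing words are emphasized in the same way.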


Figure: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6c21/9044316/f6fe36f231e6/peerj-cs-08-877-g001.jpg
