

ACR-SA: attention-based deep model through two-channel CNN and Bi-RNN for sentiment analysis.

Authors

Kamyab Marjan, Liu Guohua, Rasool Abdur, Adjeisah Michael

Affiliations

School of Computer Science and Technology, Donghua University, Shanghai, China.

Shenzhen College of Advanced Technology, University of Chinese Academy of Sciences, Beijing, China.

Publication

PeerJ Comput Sci. 2022 Mar 17;8:e877. doi: 10.7717/peerj-cs.877. eCollection 2022.

DOI: 10.7717/peerj-cs.877
PMID: 35494855
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9044316/
Abstract

Convolutional Neural Networks (CNN) and Recurrent Neural Networks (RNN) have been successfully applied to Natural Language Processing (NLP), especially in sentiment analysis. NLP can perform numerous functions and achieve significant results through RNN and CNN. Previous research shows that RNN achieves more meaningful results than CNN because it extracts long-term dependencies. Meanwhile, CNN has its own advantage: it can extract high-level features using its local fixed-size context at the input level. However, integrating these advantages into one network is challenging because of overfitting during training. Another problem with such models is that they treat all features equally. To this end, we propose an attention-based sentiment analysis model using CNN and two independent bidirectional RNN networks to address these problems and improve sentiment knowledge. First, we apply a preprocessor that enhances data quality by correcting spelling mistakes and removing noisy content. Second, our model uses CNN with max-pooling to extract contextual features and reduce feature dimensionality. Third, two independent bidirectional RNNs, i.e., Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU), are used to capture long-term dependencies. We also apply an attention mechanism to the RNN layer output to emphasize each word's attention level. Furthermore, Gaussian Noise and Dropout are applied as regularization to avoid overfitting. Finally, we verify the model's robustness on four standard datasets. Compared with existing improvements to the most recent neural network models, the experimental results show that our model significantly outperforms the state-of-the-art models.
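The attention step the abstract describes — scoring each word's RNN hidden state and collapsing the sequence into a weighted sentence vector — can be sketched in NumPy. This is a minimal additive-attention sketch, not the paper's exact parameterization: the dimensions, the tanh projection, and the parameter names `W`, `b`, `u` are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def word_attention(H, W, b, u):
    """Additive attention over RNN hidden states.

    H: (T, d) hidden state per word; W: (d, k), b: (k,), u: (k,) are
    learned projection parameters. Returns the attention-weighted
    sentence vector (d,) and the per-word weights (T,).
    """
    M = np.tanh(H @ W + b)   # (T, k) projected states
    scores = M @ u           # (T,) relevance score per word
    alpha = softmax(scores)  # attention weights: non-negative, sum to 1
    return alpha @ H, alpha  # weighted sum of hidden states

rng = np.random.default_rng(0)
T, d, k = 6, 8, 4                # 6 words, hidden size 8
H = rng.standard_normal((T, d))  # stand-in for Bi-LSTM/Bi-GRU output
W = rng.standard_normal((d, k))
b = rng.standard_normal(k)
u = rng.standard_normal(k)

ctx, alpha = word_attention(H, W, b, u)
print(ctx.shape)  # (8,)
```

In the full model, `ctx` would feed the classification head, while `alpha` exposes which words the model emphasized — the "attention level" per word mentioned in the abstract.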


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6c21/9044316/f6fe36f231e6/peerj-cs-08-877-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6c21/9044316/9a55ad72d907/peerj-cs-08-877-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6c21/9044316/64058f800c20/peerj-cs-08-877-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6c21/9044316/5f19697507e7/peerj-cs-08-877-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6c21/9044316/af099279482d/peerj-cs-08-877-g005.jpg

Similar articles

1. ACR-SA: attention-based deep model through two-channel CNN and Bi-RNN for sentiment analysis.
   PeerJ Comput Sci. 2022 Mar 17;8:e877. doi: 10.7717/peerj-cs.877. eCollection 2022.
2. Multichannel Two-Dimensional Convolutional Neural Network Based on Interactive Features and Group Strategy for Chinese Sentiment Analysis.
   Sensors (Basel). 2022 Jan 18;22(3):714. doi: 10.3390/s22030714.
3. Parallel Structure Deep Neural Network Using CNN and RNN with an Attention Mechanism for Breast Cancer Histology Image Classification.
   Cancers (Basel). 2019 Nov 29;11(12):1901. doi: 10.3390/cancers11121901.
4. Comparing deep learning architectures for sentiment analysis on drug reviews.
   J Biomed Inform. 2020 Oct;110:103539. doi: 10.1016/j.jbi.2020.103539. Epub 2020 Aug 17.
5. DC-CNN: Dual-channel Convolutional Neural Networks with attention-pooling for fake news detection.
   Appl Intell (Dordr). 2023;53(7):8354-8369. doi: 10.1007/s10489-022-03910-9. Epub 2022 Aug 1.
6. Temporal indexing of medical entity in Chinese clinical notes.
   BMC Med Inform Decis Mak. 2019 Jan 31;19(Suppl 1):17. doi: 10.1186/s12911-019-0735-x.
7. Automated AJCC (7th edition) staging of non-small cell lung cancer (NSCLC) using deep convolutional neural network (CNN) and recurrent neural network (RNN).
   Health Inf Sci Syst. 2019 Jul 30;7(1):14. doi: 10.1007/s13755-019-0077-1. eCollection 2019 Dec.
8. Character gated recurrent neural networks for Arabic sentiment analysis.
   Sci Rep. 2022 Jun 13;12(1):9779. doi: 10.1038/s41598-022-13153-w.
9. Position-Wise Gated Res2Net-Based Convolutional Network with Selective Fusing for Sentiment Analysis.
   Entropy (Basel). 2023 Apr 30;25(5):740. doi: 10.3390/e25050740.
10. Intelligent diagnosis with Chinese electronic medical records based on convolutional neural networks.
    BMC Bioinformatics. 2019 Feb 1;20(1):62. doi: 10.1186/s12859-019-2617-8.

Cited by

1. Mining Suicidal Ideation in Chinese Social Media: A Dual-Channel Deep Learning Model with Information Gain Optimization.
   Entropy (Basel). 2025 Jan 24;27(2):116. doi: 10.3390/e27020116.
2. Unboxing Deep Learning Model of Food Delivery Service Reviews Using Explainable Artificial Intelligence (XAI) Technique.
   Foods. 2022 Jul 8;11(14):2019. doi: 10.3390/foods11142019.

References

1. Event Detection System Based on User Behavior Changes in Online Social Networks: Case of the COVID-19 Pandemic.
   IEEE Access. 2020 Aug 31;8:158806-158825. doi: 10.1109/ACCESS.2020.3020391. eCollection 2020.
2. Family history information extraction via deep joint learning.
   BMC Med Inform Decis Mak. 2019 Dec 27;19(Suppl 10):277. doi: 10.1186/s12911-019-0995-5.
3. Label-less Learning for Emotion Cognition.
   IEEE Trans Neural Netw Learn Syst. 2020 Jul;31(7):2430-2440. doi: 10.1109/TNNLS.2019.2929071. Epub 2019 Aug 13.
4. Long short-term memory.
   Neural Comput. 1997 Nov 15;9(8):1735-80. doi: 10.1162/neco.1997.9.8.1735.