
CharAs-CBert: Character Assist Construction-Bert Sentence Representation Improving Sentiment Classification.

Affiliation

School of Artificial Intelligence, Beijing Normal University, No. 19, Xinjiekouwai St., Haidian District, Beijing 100875, China.

Publication information

Sensors (Basel). 2022 Jul 3;22(13):5024. doi: 10.3390/s22135024.

Abstract

In the process of semantic capture, traditional sentence representation methods tend to lose much of the global and contextual semantics and ignore the internal structural information of words in sentences. To address these limitations, we propose a character-assisted construction-Bert sentence representation method (CharAs-CBert) to improve the accuracy of sentiment text classification. First, based on the construction, a more effective construction vector is generated to distinguish the basic morphology of the sentence and reduce the ambiguity of the same word across different sentences, while strengthening the representation of salient words and effectively capturing contextual semantics. Second, character feature vectors are introduced to explore the internal structural information of sentences and improve the representation of local and global semantics. Then, to give the sentence representation better stability and robustness, character information, word information, and construction vectors are combined into a single sentence representation. Finally, evaluation and verification are carried out on open-source benchmark datasets such as ACL-14 and SemEval 2014 to demonstrate the validity and reliability of the sentence representation; on ACL-14, the F1 and ACC reach 87.54% and 92.88%, respectively.
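The abstract describes fusing character feature vectors, word vectors, and construction vectors into one sentence representation before classification. Below is a minimal sketch, assuming a PyTorch-style model in which a plain nn.Embedding stands in for BERT's contextual word vectors and construction labels are simple per-token ids; all class names, dimensions, and pooling choices are hypothetical illustrations, not the authors' implementation.

```python
# Illustrative sketch only: fuse character-level, word-level, and construction
# vectors into a sentence representation for sentiment classification.
# NOT the CharAs-CBert implementation; the real model uses BERT for word vectors.
import torch
import torch.nn as nn


class FusedSentenceClassifier(nn.Module):
    def __init__(self, vocab_size, char_vocab_size, constr_vocab_size,
                 word_dim=128, char_dim=32, constr_dim=32, num_classes=3):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)              # stand-in for BERT word vectors
        self.char_emb = nn.Embedding(char_vocab_size, char_dim)         # character feature vectors
        self.constr_emb = nn.Embedding(constr_vocab_size, constr_dim)   # construction vector per token
        self.classifier = nn.Linear(word_dim + char_dim + constr_dim, num_classes)

    def forward(self, word_ids, char_ids, constr_ids):
        # word_ids:   (batch, seq_len)            token ids
        # char_ids:   (batch, seq_len, max_chars) characters of each token
        # constr_ids: (batch, seq_len)            construction label per token
        w = self.word_emb(word_ids)                  # (B, L, word_dim)
        c = self.char_emb(char_ids).mean(dim=2)      # pool characters -> (B, L, char_dim)
        s = self.constr_emb(constr_ids)              # (B, L, constr_dim)
        fused = torch.cat([w, c, s], dim=-1)         # token-level fusion of the three views
        sent = fused.mean(dim=1)                     # mean-pool tokens into a sentence vector
        return self.classifier(sent)                 # sentiment logits


# Tiny usage example with random ids.
model = FusedSentenceClassifier(vocab_size=100, char_vocab_size=50, constr_vocab_size=10)
logits = model(torch.randint(0, 100, (2, 7)),
               torch.randint(0, 50, (2, 7, 5)),
               torch.randint(0, 10, (2, 7)))
print(logits.shape)  # torch.Size([2, 3])
```

The sketch only shows the fusion step (concatenating the three vectors per token and pooling); how the construction and character vectors are actually built is described in the paper itself.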


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a939/9269684/7d9ecfa64cd7/sensors-22-05024-g001.jpg

Similar articles

2. A BERT-Based Aspect-Level Sentiment Analysis Algorithm for Cross-Domain Text. Comput Intell Neurosci. 2022 Jun 27;2022:8726621. doi: 10.1155/2022/8726621. eCollection 2022.
3. An Improved BERT and Syntactic Dependency Representation Model for Sentiment Analysis. Comput Intell Neurosci. 2022 May 5;2022:5754151. doi: 10.1155/2022/5754151. eCollection 2022.
4. Chinese text classification method based on sentence information enhancement and feature fusion. Heliyon. 2024 Aug 24;10(17):e36861. doi: 10.1016/j.heliyon.2024.e36861. eCollection 2024 Sep 15.
5. Deep Artificial Neural Networks Reveal a Distributed Cortical Network Encoding Propositional Sentence-Level Meaning. J Neurosci. 2021 May 5;41(18):4100-4119. doi: 10.1523/JNEUROSCI.1152-20.2021. Epub 2021 Mar 22.
6. Construction and Research on Chinese Semantic Mapping Based on Linguistic Features and Sparse Self-Learning Neural Networks. Comput Intell Neurosci. 2022 Jun 20;2022:2315802. doi: 10.1155/2022/2315802. eCollection 2022.
7. A Lightweight Sentiment Analysis Framework for a Micro-Intelligent Terminal. Sensors (Basel). 2023 Jan 9;23(2):741. doi: 10.3390/s23020741.
8. Attention-Emotion-Enhanced Convolutional LSTM for Sentiment Analysis. IEEE Trans Neural Netw Learn Syst. 2022 Sep;33(9):4332-4345. doi: 10.1109/TNNLS.2021.3056664. Epub 2022 Aug 31.
9. TopicBERT: A Topic-Enhanced Neural Language Model Fine-Tuned for Sentiment Classification. IEEE Trans Neural Netw Learn Syst. 2023 Jan;34(1):380-393. doi: 10.1109/TNNLS.2021.3094987. Epub 2023 Jan 5.
10. Interactive Dual Attention Network for Text Sentiment Classification. Comput Intell Neurosci. 2020 Nov 3;2020:8858717. doi: 10.1155/2020/8858717. eCollection 2020.
