Textual emotion detection utilizing a transfer learning approach.

Authors

Hadikhah Mozhdehi Mahsa, Eftekhari Moghadam AmirMasoud

Affiliation

Faculty of Computer and Information Technology, Islamic Azad University, Qazvin, Iran.

Publication

J Supercomput. 2023 Mar 22:1-15. doi: 10.1007/s11227-023-05168-5.

DOI:10.1007/s11227-023-05168-5
PMID:37359334
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10032627/
Abstract

Many attempts have been made to automate textual emotion detection with traditional deep learning models such as LSTM, GRU, and BiLSTM. However, these models require large datasets, massive computing resources, and long training times; they are also prone to forgetting and perform poorly on small datasets. In this paper, we aim to demonstrate that transfer learning techniques can capture the contextual meaning of text better, and as a result detect the emotion expressed in it more accurately, even without a large amount of data or training time. To this end, we conduct an experiment with a pre-trained model called EmotionalBERT, which is based on bidirectional encoder representations from transformers (BERT), and compare its performance against RNN-based models on two benchmark datasets, focusing on how the amount of training data affects model performance.
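The transfer-learning setup the abstract describes — reuse a pretrained encoder and train only a small classifier on a tiny target dataset — can be sketched as follows. This is a minimal illustration, not the paper's method: the EmotionalBERT checkpoint is not reproduced here, and a hand-made table of frozen word vectors stands in for the pretrained encoder; the word list, vectors, and labels are all illustrative assumptions.

```python
import numpy as np

# Sketch of the transfer-learning pattern: keep a pretrained encoder FROZEN
# and train only a small classifier head on a tiny labelled dataset. The
# paper uses EmotionalBERT (a BERT model); here a hand-made table of frozen
# word vectors is an illustrative stand-in, not real pretrained weights.

PRETRAINED = {  # frozen "pretrained" word vectors (never updated)
    "happy": [0.9, 0.1], "joy": [0.8, 0.2], "great": [0.7, 0.1],
    "sad": [0.1, 0.9], "cry": [0.2, 0.8], "terrible": [0.1, 0.7],
}
EMOTIONS = ["positive", "negative"]

def encode(text):
    """Frozen encoder: average the pretrained vectors of known words."""
    vecs = [PRETRAINED[w] for w in text.lower().split() if w in PRETRAINED]
    return np.mean(vecs, axis=0)

# Small target dataset -- the low-resource regime the paper studies.
train = [("happy joy great", 0), ("joy happy", 0), ("great joy", 0),
         ("sad cry", 1), ("terrible sad", 1), ("cry terrible", 1)]
X = np.stack([encode(t) for t, _ in train])
y = np.array([label for _, label in train], dtype=float)

# Train only the classifier head (logistic regression) by gradient descent;
# the encoder's weights never receive a gradient update.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * float(np.mean(p - y))

def predict(text):
    p = 1.0 / (1.0 + np.exp(-(encode(text) @ w + b)))
    return EMOTIONS[int(p > 0.5)]

print(predict("so happy and great"))  # -> positive
print(predict("cry sad"))             # -> negative
```

In the actual experiment the encoder would be a BERT model fine-tuned (or frozen) via a library such as Hugging Face `transformers`, but the division of labour is the same: pretrained representations carry most of the signal, so only a small head needs to be fit to the small dataset.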


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/80a5/10032627/3a851cf1f862/11227_2023_5168_Fig1_HTML.jpg

Similar Articles

1. Textual emotion detection utilizing a transfer learning approach.
J Supercomput. 2023 Mar 22:1-15. doi: 10.1007/s11227-023-05168-5.

2. A comparative study on deep learning models for text classification of unstructured medical notes with various levels of class imbalance.
BMC Med Res Methodol. 2022 Jul 2;22(1):181. doi: 10.1186/s12874-022-01665-y.

3. Towards Transfer Learning Techniques-BERT, DistilBERT, BERTimbau, and DistilBERTimbau for Automatic Text Classification from Different Languages: A Case Study.
Sensors (Basel). 2022 Oct 26;22(21):8184. doi: 10.3390/s22218184.

4. Identifying Risk Factors Associated With Lower Back Pain in Electronic Medical Record Free Text: Deep Learning Approach Using Clinical Note Annotations.
JMIR Med Inform. 2023 Aug 9;11:e45105. doi: 10.2196/45105.

5. Deep-GenMut: Automated genetic mutation classification in oncology: A deep learning comparative study.
Heliyon. 2024 May 31;10(11):e32279. doi: 10.1016/j.heliyon.2024.e32279. eCollection 2024 Jun 15.

6. When BERT meets Bilbo: a learning curve analysis of pretrained language model on disease classification.
BMC Med Inform Decis Mak. 2022 Apr 5;21(Suppl 9):377. doi: 10.1186/s12911-022-01829-2.

7. A BERT based dual-channel explainable text emotion recognition system.
Neural Netw. 2022 Jun;150:392-407. doi: 10.1016/j.neunet.2022.03.017. Epub 2022 Mar 18.

8. Modified Bidirectional Encoder Representations From Transformers Extractive Summarization Model for Hospital Information Systems Based on Character-Level Tokens (AlphaBERT): Development and Performance Evaluation.
JMIR Med Inform. 2020 Apr 29;8(4):e17787. doi: 10.2196/17787.

9. Identifying the Perceived Severity of Patient-Generated Telemedical Queries Regarding COVID: Developing and Evaluating a Transfer Learning-Based Solution.
JMIR Med Inform. 2022 Sep 2;10(9):e37770. doi: 10.2196/37770.

10. Adapting Bidirectional Encoder Representations from Transformers (BERT) to Assess Clinical Semantic Textual Similarity: Algorithm Development and Validation Study.
JMIR Med Inform. 2021 Feb 3;9(2):e22795. doi: 10.2196/22795.