

Evaluating Ensemble Learning Methods for Multi-Modal Emotion Recognition Using Sensor Data Fusion.

Affiliations

Faculty of Computers and Information, Minia University, Minia 61519, Egypt.

Faculty of Computers and Information, Minia University; Al-Obour High Institute for Management, Computers and Information Systems, Obour, Cairo 999060, Egypt.

Publication

Sensors (Basel). 2022 Jul 27;22(15):5611. doi: 10.3390/s22155611.

DOI:10.3390/s22155611
PMID:35957167
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9371233/
Abstract

Automatic recognition of human emotions is not a trivial process: many internal and external factors affect emotion, and emotions can be expressed in many ways, such as text, speech, body gestures, or physiological responses. Emotion detection enables applications such as adaptive user interfaces, interactive games, and human-robot interaction. The availability of advanced technologies such as mobile devices, sensors, and data-analytics tools makes it possible to collect data from many sources, enabling researchers to predict human emotions accurately; most current research, however, collects such data in laboratory experiments. In this work, we use direct, real-time sensor data to construct a subject-independent (generic) multi-modal emotion prediction model. The research integrates on-body physiological markers, surrounding sensory data, and emotion measurements to achieve the following goals: (1) collect a multi-modal data set covering environmental conditions, body responses, and emotions; (2) create subject-independent predictive models of emotional states by fusing environmental and physiological variables; (3) assess ensemble learning methods, compare their performance in building a generic, highly accurate subject-independent model for emotion recognition, and compare the results with similar previous research. To this end, we conducted a real-world study "in the wild" with physiological and mobile sensors, collecting the data set from participants walking around the Minia University campus. Several ensemble learning models (Bagging, Boosting, and Stacking) were evaluated, combining K-Nearest Neighbors (KNN), Decision Tree (DT), Random Forest (RF), and Support Vector Machine (SVM) as base learners, with DT as the meta-classifier.
The results showed that the stacking ensemble technique gave the best accuracy, 98.2%, compared with the other ensemble learning variants; Bagging and Boosting reached 96.4% and 96.6%, respectively.
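The stacking setup the abstract describes can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the authors' pipeline: the generated features are a stand-in for the fused environmental and physiological variables, and all hyperparameters here are illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the fused sensor features (labels = emotion classes).
X, y = make_classification(n_samples=500, n_features=12, n_informative=8,
                           n_classes=3, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# The four base learners named in the abstract: KNN, DT, RF, and SVM.
base_learners = [
    ("knn", KNeighborsClassifier(n_neighbors=5)),
    ("dt", DecisionTreeClassifier(random_state=0)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
    ("svm", SVC(probability=True, random_state=0)),
]

# DT as the meta-classifier, as in the paper; base predictions for the
# meta-level are produced with internal cross-validation.
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=DecisionTreeClassifier(random_state=0),
    cv=5,
)
stack.fit(X_tr, y_tr)
print("stacking accuracy:", accuracy_score(y_te, stack.predict(X_te)))
```

The Bagging and Boosting variants compared in the paper would follow the same pattern with `BaggingClassifier` or `AdaBoostClassifier` wrapping a single base learner instead of `StackingClassifier`.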


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/7e716cac21a7/sensors-22-05611-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/dd7a4206561b/sensors-22-05611-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/a818fcc2ef97/sensors-22-05611-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/c40a4d11069b/sensors-22-05611-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/916e7cdb9a72/sensors-22-05611-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/a74adeba5e89/sensors-22-05611-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/10ced5c6c1e2/sensors-22-05611-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/b62c25d077c5/sensors-22-05611-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/65277a5cc28e/sensors-22-05611-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/3f89fad62c0a/sensors-22-05611-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/bdd5174c4e00/sensors-22-05611-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/01da6543d5d1/sensors-22-05611-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/d47ea432daed/sensors-22-05611-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/471f9f3fe1ae/sensors-22-05611-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/22af63c9a4f1/sensors-22-05611-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/f14a54f7d33d/sensors-22-05611-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2f0e/9371233/2037f8a6d6fe/sensors-22-05611-g017.jpg

Similar articles

1
Evaluating Ensemble Learning Methods for Multi-Modal Emotion Recognition Using Sensor Data Fusion.
Sensors (Basel). 2022 Jul 27;22(15):5611. doi: 10.3390/s22155611.
2
An Ensemble Learning Approach for Electrocardiogram Sensor Based Human Emotion Recognition.
Sensors (Basel). 2019 Oct 16;19(20):4495. doi: 10.3390/s19204495.
3
A novel speech emotion recognition method based on feature construction and ensemble learning.
PLoS One. 2022 Aug 15;17(8):e0267132. doi: 10.1371/journal.pone.0267132. eCollection 2022.
4
EEG rhythm based emotion recognition using multivariate decomposition and ensemble machine learning classifier.
J Neurosci Methods. 2023 Jun 1;393:109879. doi: 10.1016/j.jneumeth.2023.109879. Epub 2023 May 12.
5
EEG-Based Emotion Classification Using Stacking Ensemble Approach.
Sensors (Basel). 2022 Nov 6;22(21):8550. doi: 10.3390/s22218550.
6
A Comparison of Machine Learning Algorithms and Feature Sets for Automatic Vocal Emotion Recognition in Speech.
Sensors (Basel). 2022 Oct 6;22(19):7561. doi: 10.3390/s22197561.
7
Recognition of Emotion Intensities Using Machine Learning Algorithms: A Comparative Study.
Sensors (Basel). 2019 Apr 21;19(8):1897. doi: 10.3390/s19081897.
8
Multi-modal emotion recognition using EEG and speech signals.
Comput Biol Med. 2022 Oct;149:105907. doi: 10.1016/j.compbiomed.2022.105907. Epub 2022 Jul 22.
9
Reward-Penalty Weighted Ensemble for Emotion State Classification from Multi-Modal Data Streams.
Int J Neural Syst. 2022 Dec;32(12):2250049. doi: 10.1142/S0129065722500496. Epub 2022 Sep 21.
10
The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study.
JMIR Hum Factors. 2024 Jan 26;11:e45494. doi: 10.2196/45494.

Cited by

1
Home Robot Interaction Based on EEG Motor Imagery and Visual Perception Fusion.
Sensors (Basel). 2025 Sep 6;25(17):5568. doi: 10.3390/s25175568.
2
Multimodal Sensing-Enabled Large Language Models for Automated Emotional Regulation: A Review of Current Technologies, Opportunities, and Challenges.
Sensors (Basel). 2025 Aug 1;25(15):4763. doi: 10.3390/s25154763.
3
Enhancing patient rehabilitation outcomes: artificial intelligence-driven predictive modeling for home discharge in neurological and orthopedic conditions.
J Neuroeng Rehabil. 2025 May 26;22(1):117. doi: 10.1186/s12984-025-01654-4.
4
Enhancing Situational Awareness with VAS-Compass Net for the Recognition of Directional Vehicle Alert Sounds.
Sensors (Basel). 2024 Oct 24;24(21):6841. doi: 10.3390/s24216841.
5
Spectrum Evaluation in CR-Based Smart Healthcare Systems Using Optimizable Tree Machine Learning Approach.
Sensors (Basel). 2023 Aug 27;23(17):7456. doi: 10.3390/s23177456.

References

1
Deep learning framework for subject-independent emotion detection using wireless signals.
PLoS One. 2021 Feb 3;16(2):e0242946. doi: 10.1371/journal.pone.0242946. eCollection 2021.
2
Using Machine Learning and Smartphone and Smartwatch Data to Detect Emotional States and Transitions: Exploratory Study.
JMIR Mhealth Uhealth. 2020 Sep 29;8(9):e17818. doi: 10.2196/17818.
3
Wearable Emotion Recognition Using Heart Rate Data from a Smart Bracelet.
Sensors (Basel). 2020 Jan 28;20(3):718. doi: 10.3390/s20030718.
4
Human Emotion Recognition: Review of Sensors and Methods.
Sensors (Basel). 2020 Jan 21;20(3):592. doi: 10.3390/s20030592.
5
Improving methodology in heart rate variability analysis for the premature infants: Impact of the time length.
PLoS One. 2019 Aug 9;14(8):e0220692. doi: 10.1371/journal.pone.0220692. eCollection 2019.
6
A Globally Generalized Emotion Recognition System Involving Different Physiological Signals.
Sensors (Basel). 2018 Jun 11;18(6):1905. doi: 10.3390/s18061905.
7
A simple algorithm for emotion recognition, using physiological signals of a smart watch.
Annu Int Conf IEEE Eng Med Biol Soc. 2017 Jul;2017:2353-2356. doi: 10.1109/EMBC.2017.8037328.
8
Emotion recognition using Kinect motion capture data of human gaits.
PeerJ. 2016 Sep 15;4:e2364. doi: 10.7717/peerj.2364. eCollection 2016.
9
Emotion recognition based on customized smart bracelet with built-in accelerometer.
PeerJ. 2016 Jul 26;4:e2258. doi: 10.7717/peerj.2258. eCollection 2016.
10
A Comparison of Physiological Signal Analysis Techniques and Classifiers for Automatic Emotional Evaluation of Audiovisual Contents.
Front Comput Neurosci. 2016 Jul 15;10:74. doi: 10.3389/fncom.2016.00074. eCollection 2016.