Suppr 超能文献



Deep learning framework for subject-independent emotion detection using wireless signals.

Authors

Khan Ahsan Noor, Ihalage Achintha Avin, Ma Yihan, Liu Baiyang, Liu Yujie, Hao Yang

Affiliation

School of Electronic Engineering and Computer Science, Queen Mary University of London, London, United Kingdom.

Publication

PLoS One. 2021 Feb 3;16(2):e0242946. doi: 10.1371/journal.pone.0242946. eCollection 2021.

DOI: 10.1371/journal.pone.0242946
PMID: 33534826
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7857608/
Abstract

Emotion state recognition using wireless signals is an emerging area of research that has an impact on neuroscientific studies of human behaviour and well-being monitoring. Currently, standoff emotion detection is mostly reliant on the analysis of facial expressions and/or eye movements acquired from optical or video cameras. Meanwhile, although machine learning approaches have been widely accepted for recognizing human emotions from multimodal data, they have been mostly restricted to subject-dependent analyses, which lack generality. In this paper, we report an experimental study which collects heartbeat and breathing signals of 15 participants from radio frequency (RF) reflections off the body, followed by novel noise-filtering techniques. We propose a novel deep neural network (DNN) architecture based on the fusion of raw RF data and the processed RF signal for classifying and visualising various emotion states. The proposed model achieves a high classification accuracy of 71.67% for independent subjects, with precision, recall and F1-score values of 0.71, 0.72 and 0.71, respectively. We have compared our results with those obtained from five different classical ML algorithms, and it is established that deep learning offers superior performance even with a limited amount of raw RF and post-processed time-sequence data. The deep learning model has also been validated by comparing our results with those from ECG signals. Our results indicate that using wireless signals for standoff emotion state detection is a better alternative to other technologies, offering high accuracy and much wider applications in future studies of behavioural sciences.
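Two ideas in the abstract are easy to make concrete: "subject-independent" evaluation (the model is tested on participants it never saw during training, typically via leave-one-subject-out cross-validation) and the macro-averaged precision/recall/F1 metrics reported (0.71/0.72/0.71). The sketch below is purely illustrative and is not the authors' pipeline; the helper names and toy data are assumptions for demonstration:

```python
# Illustrative sketch (NOT the paper's code): leave-one-subject-out
# splitting for subject-independent evaluation, plus macro-averaged
# precision/recall/F1. All names and toy data here are invented.

def leave_one_subject_out(subject_ids):
    """Yield (held_out, train_idx, test_idx), holding out one subject per fold."""
    for held_out in sorted(set(subject_ids)):
        train = [i for i, s in enumerate(subject_ids) if s != held_out]
        test = [i for i, s in enumerate(subject_ids) if s == held_out]
        yield held_out, train, test

def macro_precision_recall_f1(y_true, y_pred, classes):
    """Macro-averaged precision and recall over classes; F1 is taken as the
    harmonic mean of the macro averages (one common convention)."""
    precisions, recalls = [], []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    precision = sum(precisions) / len(classes)
    recall = sum(recalls) / len(classes)
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy run: 15 subjects, as in the study, one recording each.
subject_ids = list(range(15))
folds = list(leave_one_subject_out(subject_ids))
assert len(folds) == 15  # one fold per held-out subject

labels = [i % 2 for i in range(15)]
p, r, f1 = macro_precision_recall_f1(labels, labels, classes=[0, 1])
# Perfect predictions give p == r == f1 == 1.0.
```

With 15 participants, this yields 15 train/test folds, each testing on one unseen subject — the setting in which the paper's 71.67% accuracy is reported.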


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7086/7857608/af0ca73c9b52/pone.0242946.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7086/7857608/a0a9e2de8dc1/pone.0242946.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7086/7857608/ce92eebb469b/pone.0242946.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7086/7857608/8f187f9e177d/pone.0242946.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7086/7857608/402900127ce0/pone.0242946.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7086/7857608/aaa721031df4/pone.0242946.g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/7086/7857608/6b748d80fabc/pone.0242946.g007.jpg

Similar Articles

1. Deep learning framework for subject-independent emotion detection using wireless signals.
PLoS One. 2021 Feb 3;16(2):e0242946. doi: 10.1371/journal.pone.0242946. eCollection 2021.
2. FusionSense: Emotion Classification Using Feature Fusion of Multimodal Data and Deep Learning in a Brain-Inspired Spiking Neural Network.
Sensors (Basel). 2020 Sep 17;20(18):5328. doi: 10.3390/s20185328.
3. Wireless Sensing Technology Combined with Facial Expression to Realize Multimodal Emotion Recognition.
Sensors (Basel). 2022 Dec 28;23(1):338. doi: 10.3390/s23010338.
4. Emotion Classification Based on Pulsatile Images Extracted from Short Facial Videos via Deep Learning.
Sensors (Basel). 2024 Apr 19;24(8):2620. doi: 10.3390/s24082620.
5. A Deep-Learning Model for Subject-Independent Human Emotion Recognition Using Electrodermal Activity Sensors.
Sensors (Basel). 2019 Apr 7;19(7):1659. doi: 10.3390/s19071659.
6. The Design of CNN Architectures for Optimal Six Basic Emotion Classification Using Multiple Physiological Signals.
Sensors (Basel). 2020 Feb 6;20(3):866. doi: 10.3390/s20030866.
7. Respiration Based Non-Invasive Approach for Emotion Recognition Using Impulse Radio Ultra Wide Band Radar and Machine Learning.
Sensors (Basel). 2021 Dec 13;21(24):8336. doi: 10.3390/s21248336.
8. DRER: Deep Learning-Based Driver's Real Emotion Recognizer.
Sensors (Basel). 2021 Mar 19;21(6):2166. doi: 10.3390/s21062166.
9. EEG-Based Multi-Modal Emotion Recognition using Bag of Deep Features: An Optimal Feature Selection Approach.
Sensors (Basel). 2019 Nov 28;19(23):5218. doi: 10.3390/s19235218.
10. Investigating the Use of Pretrained Convolutional Neural Network on Cross-Subject and Cross-Dataset EEG Emotion Recognition.
Sensors (Basel). 2020 Apr 4;20(7):2034. doi: 10.3390/s20072034.

Articles Citing This Work

1. In-silico simultaneous respiratory and circulatory measurement during voluntary breathing, exercise, and mental stress: A computational approach.
PLoS Comput Biol. 2024 Dec 17;20(12):e1012645. doi: 10.1371/journal.pcbi.1012645. eCollection 2024 Dec.
2. RF sensing enabled tracking of human facial expressions using machine learning algorithms.
Sci Rep. 2024 Nov 13;14(1):27800. doi: 10.1038/s41598-024-75909-w.
3. Artificial intelligence assists precision medicine in cancer treatment.
Front Oncol. 2023 Jan 4;12:998222. doi: 10.3389/fonc.2022.998222. eCollection 2022.
4. Wireless Sensing Technology Combined with Facial Expression to Realize Multimodal Emotion Recognition.
Sensors (Basel). 2022 Dec 28;23(1):338. doi: 10.3390/s23010338.
5. Radar-based remote physiological sensing: Progress, challenges, and opportunities.
Front Physiol. 2022 Oct 11;13:955208. doi: 10.3389/fphys.2022.955208. eCollection 2022.
6. Evaluating Ensemble Learning Methods for Multi-Modal Emotion Recognition Using Sensor Data Fusion.
Sensors (Basel). 2022 Jul 27;22(15):5611. doi: 10.3390/s22155611.
7. Deep learning for behaviour classification in a preclinical brain injury model.
PLoS One. 2022 Jun 15;17(6):e0268962. doi: 10.1371/journal.pone.0268962. eCollection 2022.
8. A review on machine learning and deep learning for various antenna design applications.
Heliyon. 2022 Apr 22;8(4):e09317. doi: 10.1016/j.heliyon.2022.e09317. eCollection 2022 Apr.
9. End-to-End Depression Recognition Based on a One-Dimensional Convolution Neural Network Model Using Two-Lead ECG Signal.
J Med Biol Eng. 2022;42(2):225-233. doi: 10.1007/s40846-022-00687-7. Epub 2022 Feb 7.

References Cited in This Article

1. A deep learning model to predict RNA-Seq expression of tumours from whole slide images.
Nat Commun. 2020 Aug 3;11(1):3877. doi: 10.1038/s41467-020-17678-4.
2. Deep learning models in genomics; are we there yet?
Comput Struct Biotechnol J. 2020 Jun 17;18:1466-1473. doi: 10.1016/j.csbj.2020.06.017. eCollection 2020.
3. Emotion schemas are embedded in the human visual system.
Sci Adv. 2019 Jul 24;5(7):eaaw4358. doi: 10.1126/sciadv.aaw4358. eCollection 2019 Jul.
4. A Deep Neural Network Model using Random Forest to Extract Feature Representation for Gene Expression Data Classification.
Sci Rep. 2018 Nov 7;8(1):16477. doi: 10.1038/s41598-018-34833-6.
5. Wearable Health Devices-Vital Sign Monitoring, Systems and Technologies.
Sensors (Basel). 2018 Jul 25;18(8):2414. doi: 10.3390/s18082414.
6. A Review of Emotion Recognition Using Physiological Signals.
Sensors (Basel). 2018 Jun 28;18(7):2074. doi: 10.3390/s18072074.
7. DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-cost Off-the-Shelf Devices.
IEEE J Biomed Health Inform. 2018 Jan;22(1):98-107. doi: 10.1109/JBHI.2017.2688239. Epub 2017 Mar 27.
8. Gene expression inference with deep learning.
Bioinformatics. 2016 Jun 15;32(12):1832-9. doi: 10.1093/bioinformatics/btw074. Epub 2016 Feb 11.
9. All-IP wireless sensor networks for real-time patient monitoring.
J Biomed Inform. 2014 Dec;52:406-17. doi: 10.1016/j.jbi.2014.08.002. Epub 2014 Aug 19.
10. Bodily maps of emotions.
Proc Natl Acad Sci U S A. 2014 Jan 14;111(2):646-51. doi: 10.1073/pnas.1321664111. Epub 2013 Dec 30.