

Fear Recognition for Women Using a Reduced Set of Physiological Signals.

Author Information

Miranda Jose A, Canabal Manuel F, Gutiérrez-Martín Laura, Lanza-Gutierrez Jose M, Portela-García Marta, López-Ongil Celia

Affiliations

Electronic Technology Department, Universidad Carlos III de Madrid, 28911 Leganés, Madrid, Spain.

Department of Computer Science, University of Alcalá, 28871 Alcalá de Henares, Madrid, Spain.

Publication Information

Sensors (Basel). 2021 Feb 25;21(5):1587. doi: 10.3390/s21051587.

DOI: 10.3390/s21051587
PMID: 33668745
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7956215/
Abstract

Emotion recognition is benefitting from the latest research into physiological monitoring and wireless communications, among other remarkable achievements. These technologies can indeed provide solutions to protect vulnerable people in scenarios such as personal assaults, the abuse of children or the elderly, gender violence or sexual aggression. Cyberphysical systems using smart sensors, artificial intelligence and wearable and inconspicuous devices can serve as bodyguards to detect these risky situations (through fear-related emotion detection) and automatically trigger a protection protocol. As expected, these systems should be trained and customized for each user to ensure the best possible performance, which undoubtedly requires a gender perspective. This paper presents a specialized fear recognition system for women based on a reduced set of physiological signals. The architecture proposed is characterized by the usage of three physiological sensors, lightweight binary classification and the conjunction of linear (temporal and frequency) and non-linear features. Moreover, a binary fear mapping strategy between dimensional and discrete emotional information based on emotional self-report data is implemented to avoid emotional bias. The architecture is evaluated using a public multi-modal physiological dataset with two approaches (subject-dependent and subject-independent models) focusing on the female participants. As a result, the proposal outperforms the state-of-the-art in fear recognition, achieving a recognition rate of up to 96.33% for the subject-dependent model.
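The pipeline the abstract describes — per-window linear (temporal, frequency) and non-linear features extracted from a small set of physiological channels, feeding a lightweight binary fear classifier — can be sketched as follows. This is an illustrative toy under stated assumptions, not the authors' implementation: the sensor choices (GSR, BVP, skin temperature), the specific features, and the synthetic training data are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(window):
    """Illustrative per-signal features in the spirit of the paper's
    linear (temporal, frequency) and non-linear groups; the actual
    feature list used in the paper is not reproduced here."""
    feats = [window.mean(), window.std()]        # temporal
    spec = np.abs(np.fft.rfft(window)) ** 2
    feats.append(spec[1:].sum() / spec.sum())    # frequency: non-DC power ratio
    feats.append(np.mean(np.abs(np.diff(window))))  # simple non-linear proxy
    return feats

def window_features(signals):
    # signals: list of 1-D arrays, one window per channel
    return np.concatenate([extract_features(s) for s in signals])

def make_window(fear, n=128):
    # Hypothetical three-channel window: GSR, BVP, skin temperature.
    gsr = 1.0 + 0.8 * fear + 0.1 * rng.standard_normal(n)
    bvp = np.sin(np.linspace(0, 20 + 10 * fear, n)) + 0.1 * rng.standard_normal(n)
    temp = 33.0 - 0.5 * fear + 0.05 * rng.standard_normal(n)
    return [gsr, bvp, temp]

labels = np.array([1] * 80 + [0] * 80)
X = np.array([window_features(make_window(y)) for y in labels])

# Lightweight binary classifier: logistic regression via gradient descent.
mu, sd = X.mean(0), X.std(0) + 1e-9
Xn = (X - mu) / sd
w, b = np.zeros(Xn.shape[1]), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(Xn @ w + b)))
    grad = p - labels
    w -= 0.1 * Xn.T @ grad / len(labels)
    b -= 0.1 * grad.mean()

acc = ((1.0 / (1.0 + np.exp(-(Xn @ w + b))) > 0.5) == labels).mean()
```

On this deliberately separable synthetic data the classifier fits easily; the point is only the shape of the pipeline (windowed multi-channel features, a small binary model), which is the structure the abstract attributes to the proposed architecture.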


[Figure images: main figures sensors-21-01587-g001 through g005 and appendix figures g0A1 through g0A22, available via the PMC full text at https://pmc.ncbi.nlm.nih.gov/articles/PMC7956215/]

Similar Articles

1
Fear Recognition for Women Using a Reduced Set of Physiological Signals.
Sensors (Basel). 2021 Feb 25;21(5):1587. doi: 10.3390/s21051587.
2
CorrNet: Fine-Grained Emotion Recognition for Video Watching Using Wearable Physiological Sensors.
Sensors (Basel). 2020 Dec 24;21(1):52. doi: 10.3390/s21010052.
3
Emotion recognition based on physiological changes in music listening.
IEEE Trans Pattern Anal Mach Intell. 2008 Dec;30(12):2067-83. doi: 10.1109/TPAMI.2008.26.
4
Fear Detection in Multimodal Affective Computing: Physiological Signals versus Catecholamine Concentration.
Sensors (Basel). 2022 May 26;22(11):4023. doi: 10.3390/s22114023.
5
Development and Progress in Sensors and Technologies for Human Emotion Recognition.
Sensors (Basel). 2021 Aug 18;21(16):5554. doi: 10.3390/s21165554.
6
Emotion Elicitation Under Audiovisual Stimuli Reception: Should Artificial Intelligence Consider the Gender Perspective?
Int J Environ Res Public Health. 2020 Nov 17;17(22):8534. doi: 10.3390/ijerph17228534.
7
Subject-independent emotion recognition based on physiological signals: a three-stage decision method.
BMC Med Inform Decis Mak. 2017 Dec 20;17(Suppl 3):167. doi: 10.1186/s12911-017-0562-x.
8
A concept for emotion recognition systems for children with profound intellectual and multiple disabilities based on artificial intelligence using physiological and motion signals.
Disabil Rehabil Assist Technol. 2024 May;19(4):1319-1326. doi: 10.1080/17483107.2023.2170478. Epub 2023 Jan 25.
9
Emotion recognition from single-channel EEG signals using a two-stage correlation and instantaneous frequency-based filtering method.
Comput Methods Programs Biomed. 2019 May;173:157-165. doi: 10.1016/j.cmpb.2019.03.015. Epub 2019 Mar 22.
10
Automatic Classification of Emotions Based on Cardiac Signals: A Systematic Literature Review.
Ann Biomed Eng. 2023 Nov;51(11):2393-2414. doi: 10.1007/s10439-023-03341-8. Epub 2023 Aug 5.

Cited By

1
Dynamic emotion intensity estimation from physiological signals facilitating interpretation via appraisal theory.
PLoS One. 2025 Jan 24;20(1):e0315929. doi: 10.1371/journal.pone.0315929. eCollection 2025.
2
Personalized Clustering for Emotion Recognition Improvement.
Sensors (Basel). 2024 Dec 19;24(24):8110. doi: 10.3390/s24248110.
3
Multi-Input CNN-LSTM deep learning model for fear level classification based on EEG and peripheral physiological signals.
Front Psychol. 2023 Jun 1;14:1141801. doi: 10.3389/fpsyg.2023.1141801. eCollection 2023.

References

1
Emotion Elicitation Under Audiovisual Stimuli Reception: Should Artificial Intelligence Consider the Gender Perspective?
Int J Environ Res Public Health. 2020 Nov 17;17(22):8534. doi: 10.3390/ijerph17228534.
2
Human Emotion Recognition: Review of Sensors and Methods.
Sensors (Basel). 2020 Jan 21;20(3):592. doi: 10.3390/s20030592.
3
Wearable-Based Affect Recognition-A Review.
Sensors (Basel). 2019 Sep 20;19(19):4079. doi: 10.3390/s19194079.
4
Gender biases in the training methods of affective computing: Redesign and validation of the Self-Assessment Manikin in measuring emotions audiovisual clips.
Front Psychol. 2022 Oct 20;13:955530. doi: 10.3389/fpsyg.2022.955530. eCollection 2022.
5
Fear Detection in Multimodal Affective Computing: Physiological Signals versus Catecholamine Concentration.
Sensors (Basel). 2022 May 26;22(11):4023. doi: 10.3390/s22114023.
6
Machine Learning Methods for Fear Classification Based on Physiological Features.
Sensors (Basel). 2021 Jul 1;21(13):4519. doi: 10.3390/s21134519.
7
Fear Level Classification Based on Emotional Dimensions and Machine Learning Techniques.
Sensors (Basel). 2019 Apr 11;19(7):1738. doi: 10.3390/s19071738.
8
Females Are More Sensitive to Opponent's Emotional Feedback: Evidence From Event-Related Potentials.
Front Hum Neurosci. 2018 Jul 10;12:275. doi: 10.3389/fnhum.2018.00275. eCollection 2018.
9
A Review of Emotion Recognition Using Physiological Signals.
Sensors (Basel). 2018 Jun 28;18(7):2074. doi: 10.3390/s18072074.
10
Power Spectral Density Analysis of Electrodermal Activity for Sympathetic Function Assessment.
Ann Biomed Eng. 2016 Oct;44(10):3124-3135. doi: 10.1007/s10439-016-1606-6. Epub 2016 Apr 8.
11
What Scientists Who Study Emotion Agree About.
Perspect Psychol Sci. 2016 Jan;11(1):31-4. doi: 10.1177/1745691615596992.
12
100% classification accuracy considered harmful: the normalized information transfer factor explains the accuracy paradox.
PLoS One. 2014 Jan 10;9(1):e84217. doi: 10.1371/journal.pone.0084217. eCollection 2014.
13
Gender differences in emotion recognition: Impact of sensory modality and emotional category.
Cogn Emot. 2014 Apr;28(3):452-69. doi: 10.1080/02699931.2013.837378. Epub 2013 Oct 24.