

Multi-Sensor Data Fusion and CNN-LSTM Model for Human Activity Recognition System.

Affiliations

Academy of Artificial Intelligence, Beijing Institute of Petrochemical Technology, Beijing 102617, China.

Beijing Academy of Safety Engineering and Technology, Beijing 102617, China.

Publication Information

Sensors (Basel). 2023 May 14;23(10):4750. doi: 10.3390/s23104750.

DOI: 10.3390/s23104750
PMID: 37430664
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10221064/
Abstract

Human activity recognition (HAR) is becoming increasingly important, especially with the growing number of elderly people living at home. However, most sensors, such as cameras, do not perform well in low-light environments. To address this issue, we designed a HAR system that combines a camera and a millimeter wave radar, taking advantage of each sensor and a fusion algorithm to distinguish between confusing human activities and to improve accuracy in low-light settings. To extract the spatial and temporal features contained in the multisensor fusion data, we designed an improved CNN-LSTM model. In addition, three data fusion algorithms were studied and investigated. Compared to camera data in low-light environments, the fusion data significantly improved the HAR accuracy by at least 26.68%, 19.87%, and 21.92% under the data level fusion algorithm, feature level fusion algorithm, and decision level fusion algorithm, respectively. Moreover, the data level fusion algorithm also resulted in a reduction of the best misclassification rate to 2%~6%. These findings suggest that the proposed system has the potential to enhance the accuracy of HAR in low-light environments and to decrease human activity misclassification rates.
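The abstract compares data-level, feature-level, and decision-level fusion of the camera and radar streams. The snippet below is a minimal sketch of the decision-level idea only: each sensor's classifier produces class probabilities, and the system combines them by weighted averaging before picking the activity. The activity labels, weights, and probability values are hypothetical illustrations, not taken from the paper, and this is not the authors' exact algorithm.

```python
import numpy as np

def decision_level_fusion(prob_camera, prob_radar, w_camera=0.5, w_radar=0.5):
    """Fuse two classifiers' probability outputs by weighted averaging."""
    fused = w_camera * np.asarray(prob_camera, dtype=float) \
          + w_radar * np.asarray(prob_radar, dtype=float)
    return fused / fused.sum()  # renormalize to a valid distribution

# Hypothetical example: in low light the camera classifier is uncertain,
# while the millimeter-wave radar classifier is confident.
activities = ["walking", "sitting", "falling"]
p_cam = [0.2, 0.5, 0.3]
p_radar = [0.1, 0.1, 0.8]
fused = decision_level_fusion(p_cam, p_radar)
print(activities[int(np.argmax(fused))])  # prints "falling"
```

This illustrates why fusion helps in low-light settings: the radar's confident evidence outweighs the camera's ambiguous output, so the fused decision (here "falling", fused probability 0.55) differs from the camera-only decision ("sitting").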


[20 article figures (sensors-23-04750-g001 through g020) are available via the PMC full text linked above.]

Similar Articles

1
Multi-Sensor Data Fusion and CNN-LSTM Model for Human Activity Recognition System.
Sensors (Basel). 2023 May 14;23(10):4750. doi: 10.3390/s23104750.
2
Human Activity Recognition Method Based on FMCW Radar Sensor with Multi-Domain Feature Attention Fusion Network.
Sensors (Basel). 2023 May 26;23(11):5100. doi: 10.3390/s23115100.
3
LSTM Networks Using Smartphone Data for Sensor-Based Human Activity Recognition in Smart Homes.
Sensors (Basel). 2021 Feb 26;21(5):1636. doi: 10.3390/s21051636.
4
Radar Human Activity Recognition with an Attention-Based Deep Learning Network.
Sensors (Basel). 2023 Mar 16;23(6):3185. doi: 10.3390/s23063185.
5
Towards a Low-Cost Solution for Gait Analysis Using Millimeter Wave Sensor and Machine Learning.
Sensors (Basel). 2022 Jul 22;22(15):5470. doi: 10.3390/s22155470.
6
Multi-Objective Association Detection of Farmland Obstacles Based on Information Fusion of Millimeter Wave Radar and Camera.
Sensors (Basel). 2022 Dec 26;23(1):230. doi: 10.3390/s23010230.
7
Wearable Sensor-Based Residual Multifeature Fusion Shrinkage Networks for Human Activity Recognition.
Sensors (Basel). 2024 Jan 24;24(3):758. doi: 10.3390/s24030758.
8
Deep CNN-LSTM With Self-Attention Model for Human Activity Recognition Using Wearable Sensor.
IEEE J Transl Eng Health Med. 2022 May 25;10:2700316. doi: 10.1109/JTEHM.2022.3177710. eCollection 2022.
9
Improving Human Activity Recognition Performance by Data Fusion and Feature Engineering.
Sensors (Basel). 2021 Jan 20;21(3):692. doi: 10.3390/s21030692.
10
Human Activity Recognition Based on Deep Learning and Micro-Doppler Radar Data.
Sensors (Basel). 2024 Apr 15;24(8):2530. doi: 10.3390/s24082530.

Cited By

1
Risk classification assessment and early warning of large deformation of soft rock in tunnels based on CNN-LSTM model.
Sci Rep. 2024 Dec 2;14(1):29944. doi: 10.1038/s41598-024-81816-x.
2
A Review and Tutorial on Machine Learning-Enabled Radar-Based Biomedical Monitoring.
IEEE Open J Eng Med Biol. 2024 May 6;5:680-699. doi: 10.1109/OJEMB.2024.3397208. eCollection 2024.

References

1
Robust Human Activity Recognition by Integrating Image and Accelerometer Sensor Data Using Deep Fusion Network.
Sensors (Basel). 2021 Dec 28;22(1):174. doi: 10.3390/s22010174.
2
Sensor-Based Human Activity Recognition Using Adaptive Class Hierarchy.
Sensors (Basel). 2021 Nov 21;21(22):7743. doi: 10.3390/s21227743.
3
A Low-Power Fall Detector Balancing Sensitivity and False Alarm Rate.
IEEE J Biomed Health Inform. 2018 Nov;22(6):1929-1937. doi: 10.1109/JBHI.2017.2778271. Epub 2017 Nov 29.
4
Home Camera-Based Fall Detection System for the Elderly.
Sensors (Basel). 2017 Dec 9;17(12):2864. doi: 10.3390/s17122864.
5
Complex Human Activity Recognition Using Smartphone and Wrist-Worn Motion Sensors.
Sensors (Basel). 2016 Mar 24;16(4):426. doi: 10.3390/s16040426.
6
Feature selection for wearable smartphone-based human activity recognition with able bodied, elderly, and stroke patients.
PLoS One. 2015 Apr 17;10(4):e0124414. doi: 10.1371/journal.pone.0124414. eCollection 2015.
7
An automatic fall detection framework using data fusion of Doppler radar and motion sensor network.
Annu Int Conf IEEE Eng Med Biol Soc. 2014;2014:5940-3. doi: 10.1109/EMBC.2014.6944981.
8
Challenges, issues and trends in fall detection systems.
Biomed Eng Online. 2013 Jul 6;12:66. doi: 10.1186/1475-925X-12-66.