

Ensem-HAR: An Ensemble Deep Learning Model for Smartphone Sensor-Based Human Activity Recognition for Measurement of Elderly Health Monitoring.

Affiliations

Department of Electronics and Communication Engineering, Techno Main Salt Lake, Salt Lake City, EM-4/1, Sector-V, Kolkata 700091, West Bengal, India.

Department of Information Technology, Jadavpur University Second Campus, Jadavpur University, Plot No. 8, Salt Lake Bypass, LB Block, Sector III, Salt Lake City, Kolkata 700106, West Bengal, India.

Publication information

Biosensors (Basel). 2022 Jun 7;12(6):393. doi: 10.3390/bios12060393.

DOI: 10.3390/bios12060393
PMID: 35735541
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9221472/
Abstract

Biomedical images contain a huge number of sensor measurements that can provide disease characteristics. Computer-assisted analysis of such parameters aids in the early detection of disease, and as a result aids medical professionals in quickly selecting appropriate medications. Human Activity Recognition, abbreviated as 'HAR', is the prediction of common human measurements, which consist of movements such as walking, running, drinking, cooking, etc. It is extremely advantageous for services in the sphere of medical care, such as fitness trackers, senior care, and archiving patient information for future use. The two types of data that can be fed to the HAR system as input are, first, video sequences or images of human activities, and second, time-series data of physical movements during different activities recorded through sensors such as accelerometers, gyroscopes, etc., that are present in smart gadgets. In this paper, we have decided to work with time-series kind of data as the input. Here, we propose an ensemble of four deep learning-based classification models, namely, 'CNN-net', 'CNNLSTM-net', 'ConvLSTM-net', and 'StackedLSTM-net', which is termed as 'Ensem-HAR'. Each of the classification models used in the ensemble is based on a typical 1D Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) network; however, they differ in terms of their architectural variations. Prediction through the proposed Ensem-HAR is carried out by stacking predictions from each of the four mentioned classification models, then training a Blender or Meta-learner on the stacked prediction, which provides the final prediction on test data. Our proposed model was evaluated over three benchmark datasets, WISDM, PAMAP2, and UCI-HAR; the proposed Ensem-HAR model for biomedical measurement achieved 98.70%, 97.45%, and 95.05% accuracy, respectively, on the mentioned datasets. 
The results from the experiments reveal that the suggested model performs better than the other models to which it was compared.
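The stacking scheme the abstract describes — collect the class-probability outputs of the four base classifiers, stack them side by side, and train a blender (meta-learner) on the stacked predictions — can be sketched as follows. This is not the authors' code: the four deep networks (CNN-net, CNNLSTM-net, ConvLSTM-net, StackedLSTM-net) are stood in for by synthetic probability outputs, the blender is a plain softmax regression, and all names, sizes, and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_classes = 200, 6  # e.g. the 6 UCI-HAR activity classes
y = rng.integers(0, n_classes, size=n_samples)

def fake_base_predictions(y, noise):
    """Stand-in for one base model: class-probability rows that put
    (1 - noise) extra mass on the true class. Rows sum to 1."""
    p = np.full((len(y), n_classes), noise / n_classes)
    p[np.arange(len(y)), y] += 1.0 - noise
    return p

# Four hypothetical base models of varying quality (one per noise level).
base_preds = [fake_base_predictions(y, noise) for noise in (0.3, 0.4, 0.5, 0.6)]

# Stack the four probability matrices side by side: meta-features of
# shape (n_samples, 4 * n_classes), one column block per base model.
meta_X = np.hstack(base_preds)

# Train the blender: softmax regression fitted by gradient descent
# on the cross-entropy between blended and true labels.
W = np.zeros((meta_X.shape[1], n_classes))
one_hot = np.eye(n_classes)[y]
for _ in range(300):
    logits = meta_X @ W
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    probs /= probs.sum(axis=1, keepdims=True)
    grad = meta_X.T @ (probs - one_hot) / n_samples
    W -= 1.0 * grad

blended = (meta_X @ W).argmax(axis=1)
accuracy = (blended == y).mean()
print(round(accuracy, 2))
```

In practice the meta-features would come from out-of-fold predictions on held-out data, so the blender learns which base model to trust for which activity rather than memorizing training noise; that is the usual motivation for stacked generalization.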



Similar articles

1
Ensem-HAR: An Ensemble Deep Learning Model for Smartphone Sensor-Based Human Activity Recognition for Measurement of Elderly Health Monitoring.
Biosensors (Basel). 2022 Jun 7;12(6):393. doi: 10.3390/bios12060393.
2
HIT HAR: Human Image Threshing Machine for Human Activity Recognition Using Deep Learning Models.
Comput Intell Neurosci. 2022 Oct 6;2022:1808990. doi: 10.1155/2022/1808990. eCollection 2022.
3
LSTM Networks Using Smartphone Data for Sensor-Based Human Activity Recognition in Smart Homes.
Sensors (Basel). 2021 Feb 26;21(5):1636. doi: 10.3390/s21051636.
4
The Convolutional Neural Networks Training With Channel-Selectivity for Human Activity Recognition Based on Sensors.
IEEE J Biomed Health Inform. 2021 Oct;25(10):3834-3843. doi: 10.1109/JBHI.2021.3092396. Epub 2021 Oct 5.
5
Achieving More with Less: A Lightweight Deep Learning Solution for Advanced Human Activity Recognition (HAR).
Sensors (Basel). 2024 Aug 22;24(16):5436. doi: 10.3390/s24165436.
6
Sensor-Based Human Activity Recognition with Spatio-Temporal Deep Learning.
Sensors (Basel). 2021 Mar 18;21(6):2141. doi: 10.3390/s21062141.
7
A hybrid TCN-GRU model for classifying human activities using smartphone inertial signals.
PLoS One. 2024 Aug 13;19(8):e0304655. doi: 10.1371/journal.pone.0304655. eCollection 2024.
8
Deep CNN-LSTM With Self-Attention Model for Human Activity Recognition Using Wearable Sensor.
IEEE J Transl Eng Health Med. 2022 May 25;10:2700316. doi: 10.1109/JTEHM.2022.3177710. eCollection 2022.
9
Novel Deep Learning Network for Gait Recognition Using Multimodal Inertial Sensors.
Sensors (Basel). 2023 Jan 11;23(2):849. doi: 10.3390/s23020849.
10
Data Valuation Algorithm for Inertial Measurement Unit-Based Human Activity Recognition.
Sensors (Basel). 2022 Dec 24;23(1):184. doi: 10.3390/s23010184.

Cited by

1
Schizophrenia detection from electroencephalogram signals using image encoding and wrapper-based deep feature selection approach.
Sci Rep. 2025 Jul 1;15(1):21390. doi: 10.1038/s41598-025-06121-7.
2
LightGBM-Based Human Action Recognition Using Sensors.
Sensors (Basel). 2025 Jun 13;25(12):3704. doi: 10.3390/s25123704.
3
A Comparative Study of Machine Learning and Deep Learning Models for Automatic Parkinson's Disease Detection from Electroencephalogram Signals.
Diagnostics (Basel). 2025 Mar 19;15(6):773. doi: 10.3390/diagnostics15060773.
4
A Multi-Agent and Attention-Aware Enhanced CNN-BiLSTM Model for Human Activity Recognition for Enhanced Disability Assistance.
Diagnostics (Basel). 2025 Feb 22;15(5):537. doi: 10.3390/diagnostics15050537.
5
Schizophrenia Detection and Classification: A Systematic Review of the Last Decade.
Diagnostics (Basel). 2024 Nov 29;14(23):2698. doi: 10.3390/diagnostics14232698.
6
TCN-attention-HAR: human activity recognition based on attention mechanism time convolutional network.
Sci Rep. 2024 Mar 28;14(1):7414. doi: 10.1038/s41598-024-57912-3.
7
A Smartphone-Based sEMG Signal Analysis System for Human Action Recognition.
Biosensors (Basel). 2023 Aug 11;13(8):805. doi: 10.3390/bios13080805.
8
Hybrid convolution neural network with channel attention mechanism for sensor-based human activity recognition.
Sci Rep. 2023 Jul 26;13(1):12067. doi: 10.1038/s41598-023-39080-y.
9
Human Activity Recognition Using Hybrid Coronavirus Disease Optimization Algorithm for Internet of Medical Things.
Sensors (Basel). 2023 Jun 24;23(13):5862. doi: 10.3390/s23135862.
10
Machine Learning Electrocardiogram for Mobile Cardiac Pattern Extraction.
Sensors (Basel). 2023 Jun 19;23(12):5723. doi: 10.3390/s23125723.

References

1
Human Activity Recognition Based on Residual Network and BiLSTM.
Sensors (Basel). 2022 Jan 14;22(2):635. doi: 10.3390/s22020635.
2
COFE-Net: An ensemble strategy for Computer-Aided Detection for COVID-19.
Measurement (Lond). 2022 Jan;187:110289. doi: 10.1016/j.measurement.2021.110289. Epub 2021 Oct 14.
3
Coarse-Fine Convolutional Deep-Learning Strategy for Human Activity Recognition.
Sensors (Basel). 2019 Mar 31;19(7):1556. doi: 10.3390/s19071556.
4
MBOSS: A Symbolic Representation of Human Activity Recognition Using Mobile Sensors.
Sensors (Basel). 2018 Dec 10;18(12):4354. doi: 10.3390/s18124354.