


Real-Time Hand Gesture Recognition in Clinical Settings: A Low-Power FMCW Radar Integrated Sensor System with Multiple Feature Fusion.

Authors

Wang Haili, Zhang Muye, Zhang Linghao, Zhu Xiaoxiao, Cao Qixin

Affiliations

The State Key Laboratory of Mechanical System and Vibration, Shanghai Jiao Tong University, Shanghai 200240, China.

SJTU Paris Elite Institute of Technology, Shanghai Jiao Tong University, Shanghai 200240, China.

Publication

Sensors (Basel). 2025 Jul 4;25(13):4169. doi: 10.3390/s25134169.

DOI: 10.3390/s25134169
PMID: 40648424
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12252275/
Abstract

Robust and efficient contactless human-machine interaction is critical for integrated sensor systems in clinical settings, demanding low-power solutions adaptable to edge computing platforms. This paper presents a real-time hand gesture recognition system using a low-power Frequency-Modulated Continuous Wave (FMCW) radar sensor, featuring a novel Multiple Feature Fusion (MFF) framework optimized for deployment on edge devices. The proposed system integrates velocity profiles, angular variations, and spatial-temporal features through a dual-stage processing architecture: an adaptive energy thresholding detector segments gestures, followed by an attention-enhanced neural classifier. Innovations include dynamic clutter suppression and multi-path cancellation optimized for complex clinical environments. Experimental validation demonstrates high performance, achieving 98% detection recall and 93.87% classification accuracy under LOSO cross-validation. On embedded hardware, the system processes at 28 FPS, showing higher robustness against environmental noise and lower computational overhead compared with existing methods. This low-power, edge-based solution is highly suitable for applications like sterile medical control and patient monitoring, advancing contactless interaction in healthcare by addressing efficiency and robustness challenges in radar sensing for edge computing.
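The first stage of the dual-stage architecture, an adaptive energy thresholding detector that segments gestures from the incoming radar stream, can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the EWMA-based background estimate, and all parameters (`alpha`, `k`, `min_len`) are assumptions chosen for illustration; the paper's actual detector, clutter suppression, and multi-path cancellation are not reproduced here.

```python
import numpy as np

def segment_gestures(frame_energy, alpha=0.95, k=3.0, min_len=5):
    """Segment candidate gesture intervals from a per-frame radar
    energy sequence using an adaptive energy threshold.

    Illustrative sketch only: background mean/variance are tracked
    with an exponentially weighted moving average (EWMA), and a frame
    is flagged as gesture activity when its energy exceeds
    mean + k * std of the estimated background.
    """
    mean = float(frame_energy[0])  # initialize background from first frame
    var = 0.0
    active, start, segments = False, 0, []
    for i, e in enumerate(frame_energy):
        thresh = mean + k * np.sqrt(var)  # threshold adapts to noise floor
        if e > thresh:
            if not active:
                active, start = True, i  # gesture onset
        else:
            if active and i - start >= min_len:
                segments.append((start, i))  # close a long-enough segment
            active = False
            # update background statistics only on non-gesture frames,
            # so the threshold is not inflated by the gesture itself
            mean = alpha * mean + (1 - alpha) * e
            var = alpha * var + (1 - alpha) * (e - mean) ** 2
    if active and len(frame_energy) - start >= min_len:
        segments.append((start, len(frame_energy)))
    return segments
```

On an embedded target, a detector of this shape runs in O(1) per frame, which is consistent with the low-computational-overhead goal the abstract describes; the second stage (the attention-enhanced classifier) would then run only on the returned segments.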


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/51339b7d573a/sensors-25-04169-g019.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/bbfa57208680/sensors-25-04169-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/ac13e88c95c3/sensors-25-04169-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/a94077aaf01a/sensors-25-04169-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/cc804ff9e18c/sensors-25-04169-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/3451c49103a7/sensors-25-04169-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/499a16002611/sensors-25-04169-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/6076e25d7720/sensors-25-04169-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/6584ae7ba130/sensors-25-04169-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/0eafb7324c2b/sensors-25-04169-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/588158b2ad16/sensors-25-04169-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/d7e021014f4e/sensors-25-04169-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/1d173f76225a/sensors-25-04169-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/e06f04bd51db/sensors-25-04169-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/2f28f36e4124/sensors-25-04169-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/f075f6292fdb/sensors-25-04169-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/61595351130d/sensors-25-04169-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/4f215e6dc72b/sensors-25-04169-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/598d/12252275/3845900b24b8/sensors-25-04169-g018.jpg

Similar Articles

1
Real-Time Hand Gesture Recognition in Clinical Settings: A Low-Power FMCW Radar Integrated Sensor System with Multiple Feature Fusion.
Sensors (Basel). 2025 Jul 4;25(13):4169. doi: 10.3390/s25134169.
2
Gesture recognition for hearing impaired people using an ensemble of deep learning models with improving beluga whale optimization-based hyperparameter tuning.
Sci Rep. 2025 Jul 1;15(1):21441. doi: 10.1038/s41598-025-06680-9.
3
An investigation of multimodal EMG-EEG fusion strategies for upper-limb gesture classification.
J Neural Eng. 2025 Jul 10;22(4). doi: 10.1088/1741-2552/ade1f9.
4
Leveraging multithreading on edge computing for smart healthcare based on intelligent multimodal classification approach.
Comput Med Imaging Graph. 2025 Jul 1;124:102594. doi: 10.1016/j.compmedimag.2025.102594.
5
Dynamic Hand Gesture Recognition in In-Vehicle Environment Based on FMCW Radar and Transformer.
Sensors (Basel). 2021 Sep 24;21(19):6368. doi: 10.3390/s21196368.
6
Recognizing American Sign Language gestures efficiently and accurately using a hybrid transformer model.
Sci Rep. 2025 Jun 23;15(1):20253. doi: 10.1038/s41598-025-06344-8.
7
A robust neural prosthetic control strategy against arm position variability and fatigue based on multi-sensor fusion.
J Neural Eng. 2025 Jul 1;22(3). doi: 10.1088/1741-2552/ade504.
8
Highly-Optimized Radar-Based Gesture Recognition System with Depthwise Expansion Module.
Sensors (Basel). 2021 Nov 2;21(21):7298. doi: 10.3390/s21217298.
9
A Multi-Feature Fusion Approach for Road Surface Recognition Leveraging Millimeter-Wave Radar.
Sensors (Basel). 2025 Jun 18;25(12):3802. doi: 10.3390/s25123802.
10
Multi-Scale Attention Fusion Gesture-Recognition Algorithm Based on Strain Sensors.
Sensors (Basel). 2025 Jul 5;25(13):4200. doi: 10.3390/s25134200.

References Cited in This Article

1
Continuous Arabic Sign Language Recognition Models.
Sensors (Basel). 2025 May 5;25(9):2916. doi: 10.3390/s25092916.
2
Hand Gesture Recognition on Edge Devices: Sensor Technologies, Algorithms, and Processing Hardware.
Sensors (Basel). 2025 Mar 8;25(6):1687. doi: 10.3390/s25061687.
3
An Overview of Dentist-Patient Communication in Quality Dental Care.
Dent J (Basel). 2025 Jan 14;13(1):31. doi: 10.3390/dj13010031.
4
Static and Dynamic Hand Gestures: A Review of Techniques of Virtual Reality Manipulation.
Sensors (Basel). 2024 Jun 9;24(12):3760. doi: 10.3390/s24123760.
5
Hand Gesture Recognition Using FSK Radar Sensors.
Sensors (Basel). 2024 Jan 6;24(2):349. doi: 10.3390/s24020349.
6
Implementing a Hand Gesture Recognition System Based on Range-Doppler Map.
Sensors (Basel). 2022 Jun 2;22(11):4260. doi: 10.3390/s22114260.
7
Highly-Optimized Radar-Based Gesture Recognition System with Depthwise Expansion Module.
Sensors (Basel). 2021 Nov 2;21(21):7298. doi: 10.3390/s21217298.
8
Gesture Recognition in Robotic Surgery: A Review.
IEEE Trans Biomed Eng. 2021 Jun;68(6):2021-2035. doi: 10.1109/TBME.2021.3054828. Epub 2021 May 21.
9
A Frame Detection Method for Real-Time Hand Gesture Recognition Systems Using CW-Radar.
Sensors (Basel). 2020 Apr 18;20(8):2321. doi: 10.3390/s20082321.
10
Communicating with Patients with Special Health Care Needs.
Dent Clin North Am. 2016 Jul;60(3):693-705. doi: 10.1016/j.cden.2016.02.004. Epub 2016 Mar 21.