

Dataglove for Sign Language Recognition of People with Hearing and Speech Impairment via Wearable Inertial Sensors.

Affiliations

Asset Management Department, Ketai Lexun (Beijing) Communication Equipment Co., Ltd., Beijing 101111, China.

Scientific and Technological Innovation Center, Beijing 100012, China.

Publication

Sensors (Basel). 2023 Jul 26;23(15):6693. doi: 10.3390/s23156693.

DOI: 10.3390/s23156693
PMID: 37571476
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10422613/
Abstract

Finding ways to enable seamless communication between deaf and able-bodied individuals has been a challenging and pressing issue. This paper proposes a solution to this problem by designing a low-cost data glove that utilizes multiple inertial sensors with the purpose of achieving efficient and accurate sign language recognition. In this study, four machine learning models, namely decision tree (DT), support vector machine (SVM), K-nearest neighbors (KNN), and random forest (RF), were employed to recognize 20 different types of dynamic sign language data used by deaf individuals. In addition, an attention-based bidirectional long short-term memory network (Attention-BiLSTM) was proposed and evaluated. Furthermore, this study verifies the impact of the number and position of data glove nodes on the accuracy of recognizing complex dynamic sign language. Finally, the proposed method is compared with existing state-of-the-art algorithms on nine public datasets. The results indicate that the Attention-BiLSTM and RF algorithms achieved the highest performance in recognizing the twenty dynamic sign language gestures, with accuracies of 98.85% and 97.58%, respectively. This provides evidence for the feasibility of our proposed data glove and recognition methods. This study may serve as a valuable reference for the development of wearable sign language recognition devices and promote easier communication between deaf and able-bodied individuals.

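The best-performing model, Attention-BiLSTM, weights the recurrent hidden states of each timestep by learned attention scores before classification. The pure-Python sketch below illustrates only the attention-pooling step with a fixed query vector; in the actual model both the BiLSTM states and the scoring parameters are learned, so every value here is hypothetical:

```python
# Minimal sketch of attention pooling over a sequence of hidden states:
# score each timestep, softmax the scores, return the weighted sum.
import math

def attention_pool(hidden_states, query):
    """hidden_states: list of equal-length vectors, one per timestep."""
    scores = [sum(h_i * q_i for h_i, q_i in zip(h, query)) for h in hidden_states]
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]   # numerically stable softmax
    total = sum(exps)
    weights = [e / total for e in exps]
    dim = len(hidden_states[0])
    # Weighted combination of all timesteps into one context vector.
    return [sum(w * h[d] for w, h in zip(weights, hidden_states)) for d in range(dim)]
```

A classifier head would then map this pooled context vector to one of the 20 gesture classes.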

Figures (1-19)
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/2d72a093b157/sensors-23-06693-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/bba1122d8a0e/sensors-23-06693-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/f9672d0460ed/sensors-23-06693-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/dde32b7c0ee6/sensors-23-06693-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/bd6da48164c9/sensors-23-06693-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/381cf86878ff/sensors-23-06693-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/0c7c84237149/sensors-23-06693-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/b06a99c2db13/sensors-23-06693-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/9dac0919d772/sensors-23-06693-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/659462238da5/sensors-23-06693-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/cfc124e54b0f/sensors-23-06693-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/221e4d95b2de/sensors-23-06693-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/bbb30c5400e5/sensors-23-06693-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/174de30f711e/sensors-23-06693-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/dc949678b1ca/sensors-23-06693-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/eb8887fe0b0e/sensors-23-06693-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/e824c319e4de/sensors-23-06693-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/5bd3d8867f06/sensors-23-06693-g018.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/dfcd/10422613/ce627aed21ca/sensors-23-06693-g019.jpg

Similar articles

1
Dataglove for Sign Language Recognition of People with Hearing and Speech Impairment via Wearable Inertial Sensors.
Sensors (Basel). 2023 Jul 26;23(15):6693. doi: 10.3390/s23156693.
2
Development of a low-resource wearable continuous gesture-to-speech conversion system.
Disabil Rehabil Assist Technol. 2023 Nov;18(8):1441-1452. doi: 10.1080/17483107.2021.2022787. Epub 2022 Jan 21.
3
American Sign Language Recognition and Translation Using Perception Neuron Wearable Inertial Motion Capture System.
Sensors (Basel). 2024 Jan 11;24(2):453. doi: 10.3390/s24020453.
4
Wearable Sensor-Based Sign Language Recognition: A Comprehensive Review.
IEEE Rev Biomed Eng. 2021;14:82-97. doi: 10.1109/RBME.2020.3019769. Epub 2021 Jan 26.
5
MEMS Devices-Based Hand Gesture Recognition via Wearable Computing.
Micromachines (Basel). 2023 Apr 27;14(5):947. doi: 10.3390/mi14050947.
6
Sign Language Recognition Using Wearable Electronics: Implementing k-Nearest Neighbors with Dynamic Time Warping and Convolutional Neural Network Algorithms.
Sensors (Basel). 2020 Jul 11;20(14):3879. doi: 10.3390/s20143879.
7
A wearable hand gesture recognition device based on acoustic measurements at wrist.
Annu Int Conf IEEE Eng Med Biol Soc. 2017 Jul;2017:4443-4446. doi: 10.1109/EMBC.2017.8037842.
8
Sensor Fusion of Motion-Based Sign Language Interpretation with Deep Learning.
Sensors (Basel). 2020 Nov 2;20(21):6256. doi: 10.3390/s20216256.
9
Novel Wearable System to Recognize Sign Language in Real Time.
Sensors (Basel). 2024 Jul 16;24(14):4613. doi: 10.3390/s24144613.
10
Dynamic Japanese Sign Language Recognition Throw Hand Pose Estimation Using Effective Feature Extraction and Classification Approach.
Sensors (Basel). 2024 Jan 26;24(3):826. doi: 10.3390/s24030826.

Cited by

1
Sign language recognition based on dual-channel star-attention convolutional neural network.
Sci Rep. 2025 Jul 29;15(1):27685. doi: 10.1038/s41598-025-13625-9.

References

1
MEMS Devices-Based Hand Gesture Recognition via Wearable Computing.
Micromachines (Basel). 2023 Apr 27;14(5):947. doi: 10.3390/mi14050947.
2
Real-Time Hand Gesture Recognition Using Fine-Tuned Convolutional Neural Network.
Sensors (Basel). 2022 Jan 18;22(3):706. doi: 10.3390/s22030706.
3
Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition.
Sensors (Basel). 2016 Jan 18;16(1):115. doi: 10.3390/s16010115.