

Sign Language Recognition Method Based on Palm Definition Model and Multiple Classification.

Affiliations

Faculty of Information Technologies, L.N. Gumilyov Eurasian National University, Nur-Sultan 010008, Kazakhstan.

Institute of Economics, Information Technologies and Professional Education, Zhangir Khan West Kazakhstan Agrarian-Technical University, Uralsk 090000, Kazakhstan.

Publication

Sensors (Basel). 2022 Sep 1;22(17):6621. doi: 10.3390/s22176621.

DOI: 10.3390/s22176621
PMID: 36081076
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9460639/
Abstract

Technologies for pattern recognition are used in various fields. One of the most relevant and important directions is the use of pattern recognition technology, such as gesture recognition, in socially significant tasks, to develop automatic sign language interpretation systems in real time. More than 5% of the world's population-about 430 million people, including 34 million children-are deaf-mute and not always able to use the services of a living sign language interpreter. Almost 80% of people with a disabling hearing loss live in low- and middle-income countries. The development of low-cost systems of automatic sign language interpretation, without the use of expensive sensors and unique cameras, would improve the lives of people with disabilities, contributing to their unhindered integration into society. To this end, in order to find an optimal solution to the problem, this article analyzes suitable methods of gesture recognition in the context of their use in automatic gesture recognition systems, to further determine the most optimal methods. From the analysis, an algorithm based on the palm definition model and linear models for recognizing the shapes of numbers and letters of the Kazakh sign language are proposed. The advantage of the proposed algorithm is that it fully recognizes 41 letters of the 42 in the Kazakh sign alphabet. Until this time, only Russian letters in the Kazakh alphabet have been recognized. In addition, a unified function has been integrated into our system to configure the frame depth map mode, which has improved recognition performance and can be used to create a multimodal database of video data of gesture words for the gesture recognition system.
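The abstract describes the approach only at a high level: derive a palm/landmark representation of the hand, then assign the resulting feature vector to one of the sign-alphabet classes using linear models. As an illustration only (not the authors' implementation), the sketch below classifies synthetic hand-landmark vectors with a nearest-centroid rule, which is one simple linear multiclass decision rule; the class count, feature layout, and noise level are all assumptions chosen for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for palm-landmark features: each hand shape is a
# flattened vector of 21 (x, y) landmark coordinates, as a typical
# hand-pose estimator would produce (42 values per sample).
N_CLASSES = 5          # e.g. 5 letters of a sign alphabet (illustrative)
N_FEATURES = 42
SAMPLES_PER_CLASS = 30

# One cluster of noisy samples per class around a random prototype shape.
prototypes = rng.normal(size=(N_CLASSES, N_FEATURES))
X = np.vstack([p + 0.1 * rng.normal(size=(SAMPLES_PER_CLASS, N_FEATURES))
               for p in prototypes])
y = np.repeat(np.arange(N_CLASSES), SAMPLES_PER_CLASS)

# "Training": estimate one centroid per class; nearest-centroid
# assignment is equivalent to a set of linear discriminants.
centroids = np.stack([X[y == c].mean(axis=0) for c in range(N_CLASSES)])

def classify(sample):
    """Return the class whose centroid is closest to the sample."""
    return int(np.argmin(np.linalg.norm(centroids - sample, axis=1)))

# Toy self-check on the training data.
predictions = np.array([classify(x) for x in X])
accuracy = (predictions == y).mean()
print(f"toy accuracy: {accuracy:.2f}")
```

A real system would replace the synthetic clusters with landmark vectors extracted from camera or depth frames and would need per-letter training data; the point here is only the shape of the pipeline: features in, one linear decision per class out.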


Figures 1–19 (sensors-22-06621-g001 to g019) are available in the full text at PMC9460639.

Similar Articles

1. Sign Language Recognition Method Based on Palm Definition Model and Multiple Classification. Sensors (Basel). 2022 Sep 1;22(17):6621. doi: 10.3390/s22176621.
2. A Novel Phonology- and Radical-Coded Chinese Sign Language Recognition Framework Using Accelerometer and Surface Electromyography Sensors. Sensors (Basel). 2015 Sep 15;15(9):23303-24. doi: 10.3390/s150923303.
3. Dynamic Japanese Sign Language Recognition Throw Hand Pose Estimation Using Effective Feature Extraction and Classification Approach. Sensors (Basel). 2024 Jan 26;24(3):826. doi: 10.3390/s24030826.
4. A Component-Based Vocabulary-Extensible Sign Language Gesture Recognition Framework. Sensors (Basel). 2016 Apr 19;16(4):556. doi: 10.3390/s16040556.
5. A Kinect-Based Sign Language Hand Gesture Recognition System for Hearing- and Speech-Impaired: A Pilot Study of Pakistani Sign Language. Assist Technol. 2015 Spring;27(1):34-43. doi: 10.1080/10400435.2014.952845.
6. A unified framework for gesture recognition and spatiotemporal gesture segmentation. IEEE Trans Pattern Anal Mach Intell. 2009 Sep;31(9):1685-99. doi: 10.1109/TPAMI.2008.203.
7. Sign Language Recognition Using the Electromyographic Signal: A Systematic Literature Review. Sensors (Basel). 2023 Oct 9;23(19):8343. doi: 10.3390/s23198343.
8. Development of a low-resource wearable continuous gesture-to-speech conversion system. Disabil Rehabil Assist Technol. 2023 Nov;18(8):1441-1452. doi: 10.1080/17483107.2021.2022787. Epub 2022 Jan 21.
9. Hand gestures for emergency situations: A video dataset based on words from Indian sign language. Data Brief. 2020 Jul 11;31:106016. doi: 10.1016/j.dib.2020.106016. eCollection 2020 Aug.
10. UltrasonicGS: A Highly Robust Gesture and Sign Language Recognition Method Based on Ultrasonic Signals. Sensors (Basel). 2023 Feb 5;23(4):1790. doi: 10.3390/s23041790.

Cited By

1. Unsupervised Clustering and Ensemble Learning for Classifying Lip Articulation in Fingerspelling. Sensors (Basel). 2025 Jun 13;25(12):3703. doi: 10.3390/s25123703.
2. Bioinspired Photoreceptors with Neural Network for Recognition and Classification of Sign Language Gesture. Sensors (Basel). 2023 Dec 6;23(24):9646. doi: 10.3390/s23249646.
3. Continuous Sign Language Recognition and Its Translation into Intonation-Colored Speech. Sensors (Basel). 2023 Jul 13;23(14):6383. doi: 10.3390/s23146383.
4. Audio-Visual Speech and Gesture Recognition by Sensors of Mobile Devices. Sensors (Basel). 2023 Feb 17;23(4):2284. doi: 10.3390/s23042284.
5. A Sign Language Recognition System Applied to Deaf-Mute Medical Consultation. Sensors (Basel). 2022 Nov 24;22(23):9107. doi: 10.3390/s22239107.

References

1. Sign Language Recognition Using Wearable Electronics: Implementing k-Nearest Neighbors with Dynamic Time Warping and Convolutional Neural Network Algorithms. Sensors (Basel). 2020 Jul 11;20(14):3879. doi: 10.3390/s20143879.
2. A Deep Learning-Based End-to-End Composite System for Hand Detection and Gesture Recognition. Sensors (Basel). 2019 Nov 30;19(23):5282. doi: 10.3390/s19235282.
3. A smart glove with integrated triboelectric nanogenerator for self-powered gesture recognition and language expression. Sci Technol Adv Mater. 2019 Sep 11;20(1):964-971. doi: 10.1080/14686996.2019.1665458. eCollection 2019.
4. Random Forest-Based Recognition of Isolated Sign Language Subwords Using Data from Accelerometers and Surface Electromyographic Sensors. Sensors (Basel). 2016 Jan 14;16(1):100. doi: 10.3390/s16010100.