


A depth-based fall detection system using a Kinect® sensor.

Affiliations

Dipartimento di Ingegneria dell'Informazione, Università Politecnica delle Marche, Via Brecce Bianche 12, Ancona 60131, Italy.

Publication information

Sensors (Basel). 2014 Feb 11;14(2):2756-75. doi: 10.3390/s140202756.

DOI: 10.3390/s140202756
PMID: 24521943
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC3958279/
Abstract

We propose an automatic, privacy-preserving, fall detection method for indoor environments, based on the usage of the Microsoft Kinect® depth sensor, in an "on-ceiling" configuration, and on the analysis of depth frames. All the elements captured in the depth scene are recognized by means of an Ad-Hoc segmentation algorithm, which analyzes the raw depth data directly provided by the sensor. The system extracts the elements, and implements a solution to classify all the blobs in the scene. Anthropometric relationships and features are exploited to recognize one or more human subjects among the blobs. Once a person is detected, he is followed by a tracking algorithm between different frames. The use of a reference depth frame, containing the set-up of the scene, allows one to extract a human subject, even when he/she is interacting with other objects, such as chairs or desks. In addition, the problem of blob fusion is taken into account and efficiently solved through an inter-frame processing algorithm. A fall is detected if the depth blob associated to a person is near to the floor. Experimental tests show the effectiveness of the proposed solution, even in complex scenarios.
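The abstract's final decision rule (a fall is flagged when the depth blob associated with a person lies near the floor) can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the foreground test against an empty-scene reference frame follows the abstract, but all thresholds, the millimetre depth convention, and the `near_floor_ratio` criterion are illustrative assumptions.

```python
import numpy as np

def detect_fall(reference, frame, diff_thresh=100, floor_depth=2500,
                fall_margin=300, min_fg_px=50, near_floor_ratio=0.8):
    """Flag a fall from an overhead ("on-ceiling") depth frame.

    Depths are in millimetres from the sensor, so larger values mean
    closer to the floor. All thresholds are illustrative assumptions.
    """
    # Foreground: pixels whose depth differs enough from the
    # reference frame that captures the empty scene set-up.
    fg = np.abs(frame.astype(int) - reference.astype(int)) > diff_thresh
    if fg.sum() < min_fg_px:
        return False  # nothing person-sized in the scene
    # A pixel is "near the floor" if its depth is within fall_margin
    # of the floor plane's depth.
    near_floor = frame[fg] > floor_depth - fall_margin
    # Fall: most of the foreground blob lies close to the floor.
    return bool(near_floor.mean() >= near_floor_ratio)
```

For example, with a floor at 2500 mm, a standing person's blob (head around 800 mm from the sensor) is rejected, while a blob lying at 2300 mm, within 300 mm of the floor, is flagged.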


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/2f0ac05efe9a/sensors-14-02756f1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/407455be0c8f/sensors-14-02756f2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/18ae83212984/sensors-14-02756f3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/5c6fdc34886f/sensors-14-02756f4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/1d3dbc347b59/sensors-14-02756f5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/9ed9c139bb11/sensors-14-02756f6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/f31bdffe5d10/sensors-14-02756f7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a186/3958279/b94e6c863129/sensors-14-02756f8.jpg

Similar articles

1. A depth-based fall detection system using a Kinect® sensor.
Sensors (Basel). 2014 Feb 11;14(2):2756-75. doi: 10.3390/s140202756.
2. Accurate Fall Detection in a Top View Privacy Preserving Configuration.
Sensors (Basel). 2018 May 29;18(6):1754. doi: 10.3390/s18061754.
3. Depth-color fusion strategy for 3-D scene modeling with Kinect.
IEEE Trans Cybern. 2013 Dec;43(6):1560-71. doi: 10.1109/TCYB.2013.2271112.
4. New Fast Fall Detection Method Based on Spatio-Temporal Context Tracking of Head by Using Depth Images.
Sensors (Basel). 2015 Sep 11;15(9):23004-19. doi: 10.3390/s150923004.
5. Vision-based finger detection, tracking, and event identification techniques for multi-touch sensing and display systems.
Sensors (Basel). 2011;11(7):6868-92. doi: 10.3390/s110706868. Epub 2011 Jul 1.
6. Kinect as a tool for gait analysis: validation of a real-time joint extraction algorithm working in side view.
Sensors (Basel). 2015 Jan 14;15(1):1417-34. doi: 10.3390/s150101417.
7. Calibration of Kinect for Xbox One and Comparison between the Two Generations of Microsoft Sensors.
Sensors (Basel). 2015 Oct 30;15(11):27569-89. doi: 10.3390/s151127569.
8. Detecting human falls with 3-axis accelerometer and depth sensor.
Annu Int Conf IEEE Eng Med Biol Soc. 2014;2014:770-3. doi: 10.1109/EMBC.2014.6943704.
9. Continuous detection of human fall using multimodal features from Kinect sensors in scalable environment.
Comput Methods Programs Biomed. 2017 Jul;146:151-165. doi: 10.1016/j.cmpb.2017.05.007. Epub 2017 May 25.
10. The validity of the first and second generation Microsoft Kinect™ for identifying joint center locations during static postures.
Appl Ergon. 2015 Jul;49:47-54. doi: 10.1016/j.apergo.2015.01.005. Epub 2015 Feb 17.

Cited by

1. An Approach to Fall Detection Using Statistical Distributions of Thermal Signatures Obtained by a Stand-Alone Low-Resolution IR Array Sensor Device.
Sensors (Basel). 2025 Jan 16;25(2):504. doi: 10.3390/s25020504.
2. UMAHand: A dataset of inertial signals of typical hand activities.
Data Brief. 2024 Jul 10;55:110731. doi: 10.1016/j.dib.2024.110731. eCollection 2024 Aug.
3. In-Home Older Adults' Activity Pattern Monitoring Using Depth Sensors: A Review.
Sensors (Basel). 2022 Nov 23;22(23):9067. doi: 10.3390/s22239067.

References

1. Challenges, issues and trends in fall detection systems.
Biomed Eng Online. 2013 Jul 6;12:66. doi: 10.1186/1475-925X-12-66.
2. Portable preimpact fall detector with inertial sensors.
IEEE Trans Neural Syst Rehabil Eng. 2008 Apr;16(2):178-83. doi: 10.1109/TNSRE.2007.916282.
4. A Class-Imbalanced Deep Learning Fall Detection Algorithm Using Wearable Sensors.
Sensors (Basel). 2021 Sep 29;21(19):6511. doi: 10.3390/s21196511.
5. NT-FDS: A Noise Tolerant Fall Detection System Using Deep Learning on Wearable Devices.
Sensors (Basel). 2021 Mar 12;21(6):2006. doi: 10.3390/s21062006.
6. Elderly Fall Detection Systems: A Literature Survey.
Front Robot AI. 2020 Jun 23;7:71. doi: 10.3389/frobt.2020.00071. eCollection 2020.
7. On the Heterogeneity of Existing Repositories of Movements Intended for the Evaluation of Fall Detection Systems.
J Healthc Eng. 2020 Nov 30;2020:6622285. doi: 10.1155/2020/6622285. eCollection 2020.
8. A Two-Stage Fall Recognition Algorithm Based on Human Posture Features.
Sensors (Basel). 2020 Dec 5;20(23):6966. doi: 10.3390/s20236966.
9. Automatic Pose Recognition for Monitoring Dangerous Situations in Ambient-Assisted Living.
Front Bioeng Biotechnol. 2020 May 14;8:415. doi: 10.3389/fbioe.2020.00415. eCollection 2020.
10. A Study on the Application of Convolutional Neural Networks to Fall Detection Evaluated with Multiple Public Datasets.
Sensors (Basel). 2020 Mar 6;20(5):1466. doi: 10.3390/s20051466.