Suppr 超能文献



Employing Shadows for Multi-Person Tracking Based on a Single RGB-D Camera.

Affiliations

School of Software, Shandong University, Jinan 250101, China.

School of Information Science and Engineering, Shandong Normal University, Jinan 250358, China.

Publication

Sensors (Basel). 2020 Feb 15;20(4):1056. doi: 10.3390/s20041056.

DOI: 10.3390/s20041056
PMID: 32075274
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7070640/
Abstract

Although there are many algorithms to track people that are walking, existing methods mostly fail to cope with occluded bodies in the setting of multi-person tracking with one camera. In this paper, we propose a method to use people's shadows as a clue to track them instead of treating shadows as mere noise. We introduce a novel method to track multiple people by fusing shadow data from the RGB image with skeleton data, both of which are captured by a single RGB Depth (RGB-D) camera. Skeletal tracking provides the positions of people that can be captured directly, while their shadows are used to track them when they are no longer visible. Our experiments confirm that this method can efficiently handle full occlusions. It thus has substantial value in resolving the occlusion problem in multi-person tracking, even with other kinds of cameras.
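The paper's actual pipeline is not reproduced on this page, but the core fusion idea stated in the abstract (trust the skeleton measurement while the body is visible, fall back to the shadow-derived estimate during full occlusion) can be sketched as follows. All names and the coasting behavior when both cues are missing are illustrative assumptions, not taken from the paper:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Ground-plane position (x, z); units and frame are illustrative.
Point = Tuple[float, float]

@dataclass
class Track:
    """Per-person track state (hypothetical minimal version)."""
    person_id: int
    position: Point

def update(track: Track,
           skeleton_pos: Optional[Point],
           shadow_pos: Optional[Point]) -> Point:
    """One fusion step per frame:
    - body visible  -> use the skeleton measurement directly;
    - body occluded -> fall back to the shadow-derived estimate;
    - neither cue   -> coast on the last known position.
    """
    if skeleton_pos is not None:
        track.position = skeleton_pos
    elif shadow_pos is not None:
        track.position = shadow_pos
    return track.position

# A person walks, is fully occluded for one frame (skeleton lost,
# shadow still visible in the RGB image), then reappears.
t = Track(person_id=0, position=(0.0, 2.0))
update(t, skeleton_pos=(0.5, 2.0), shadow_pos=(0.5, 2.1))  # visible
update(t, skeleton_pos=None, shadow_pos=(1.0, 2.1))        # occluded
assert t.position == (1.0, 2.1)  # shadow cue keeps the track alive
update(t, skeleton_pos=(1.5, 2.0), shadow_pos=None)        # visible again
assert t.position == (1.5, 2.0)
```

The sketch reduces each cue to a single ground-plane point; the paper works with full skeleton and shadow-region data, and its shadow-to-position mapping is more involved than a direct substitution.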


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/ddba1a1c0656/sensors-20-01056-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/465948ce59ba/sensors-20-01056-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/9bdafeeaa12b/sensors-20-01056-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/f6845bda505a/sensors-20-01056-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/9f1f5832d3e7/sensors-20-01056-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/24bdef8a9199/sensors-20-01056-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/cbb788654c7b/sensors-20-01056-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/83f7211de59e/sensors-20-01056-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/4b947caa18ae/sensors-20-01056-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/fe2bf67cf648/sensors-20-01056-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/bf67f6460340/sensors-20-01056-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/f75256e65a0a/sensors-20-01056-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/84337254ce18/sensors-20-01056-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/68927f71d09d/sensors-20-01056-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/a0e65d44404e/sensors-20-01056-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/a39e6a4c8c2d/sensors-20-01056-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/eae525faa70d/sensors-20-01056-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/9d44/7070640/4bdf9767622b/sensors-20-01056-g018.jpg

Similar Articles

1. Employing Shadows for Multi-Person Tracking Based on a Single RGB-D Camera.
Sensors (Basel). 2020 Feb 15;20(4):1056. doi: 10.3390/s20041056.
2. Systematic Motion Integration with Multiple Depth Cameras Allowing Sensor Movement for Stable Skeleton Tracking.
Annu Int Conf IEEE Eng Med Biol Soc. 2022 Jul;2022:1801-1804. doi: 10.1109/EMBC48229.2022.9870876.
3. Person re-identification over camera networks using multi-task distance metric learning.
IEEE Trans Image Process. 2014 Aug;23(8):3656-70. doi: 10.1109/TIP.2014.2331755. Epub 2014 Jun 18.
4. Temporal and Spatial Denoising of Depth Maps.
Sensors (Basel). 2015 Jul 29;15(8):18506-25. doi: 10.3390/s150818506.
5. Robust Fusion of Color and Depth Data for RGB-D Target Tracking Using Adaptive Range-Invariant Depth Models and Spatio-Temporal Consistency Constraints.
IEEE Trans Cybern. 2018 Aug;48(8):2485-2499. doi: 10.1109/TCYB.2017.2740952. Epub 2017 Sep 6.
6. A general framework for tracking multiple people from a moving camera.
IEEE Trans Pattern Anal Mach Intell. 2013 Jul;35(7):1577-91. doi: 10.1109/TPAMI.2012.248.
7. A gait analysis method based on a depth camera for fall prevention.
Annu Int Conf IEEE Eng Med Biol Soc. 2014;2014:4515-8. doi: 10.1109/EMBC.2014.6944627.
8. Real-Time Human Action Recognition with a Low-Cost RGB Camera and Mobile Robot Platform.
Sensors (Basel). 2020 May 19;20(10):2886. doi: 10.3390/s20102886.
9. Spatio-Temporal Calibration of Multiple Kinect Cameras Using 3D Human Pose.
Sensors (Basel). 2022 Nov 17;22(22):8900. doi: 10.3390/s22228900.
10. Self-Supervised Multi-View Person Association and its Applications.
IEEE Trans Pattern Anal Mach Intell. 2021 Aug;43(8):2794-2808. doi: 10.1109/TPAMI.2020.2974726. Epub 2021 Jul 1.

Cited By

1. A Pruning Method for Deep Convolutional Network Based on Heat Map Generation Metrics.
Sensors (Basel). 2022 Mar 4;22(5):2022. doi: 10.3390/s22052022.
2. Multi-Object Tracking Algorithm for RGB-D Images Based on Asymmetric Dual Siamese Networks.
Sensors (Basel). 2020 Nov 25;20(23):6745. doi: 10.3390/s20236745.

References

1. Leave-One-Out Kernel Optimization for Shadow Detection and Removal.
IEEE Trans Pattern Anal Mach Intell. 2018 Mar;40(3):682-695. doi: 10.1109/TPAMI.2017.2691703. Epub 2017 Apr 6.
2. Exploiting long-term connectivity and visual motion in CRF-based multi-person tracking.
IEEE Trans Image Process. 2014 Jul;23(7):3040-56. doi: 10.1109/TIP.2014.2324292.
3. Tracking multiple humans in complex situations.
IEEE Trans Pattern Anal Mach Intell. 2004 Sep;26(9):1208-21. doi: 10.1109/TPAMI.2004.73.