
CrossFuNet: RGB and Depth Cross-Fusion Network for Hand Pose Estimation.

Affiliations

College of Information, Mechanical and Electrical Engineering, Shanghai Normal University, Shanghai 200234, China.

Shenzhen Guangjian Technology Company Ltd., Shanghai 200135, China.

Publication Information

Sensors (Basel). 2021 Sep 11;21(18):6095. doi: 10.3390/s21186095.

Abstract

Despite recent successes in hand pose estimation from RGB images or depth maps, inherent challenges remain. RGB-based methods suffer from heavy self-occlusion and depth ambiguity. Depth sensors are strongly range-dependent and generally usable only indoors, which limits the practical application of depth-based methods. These challenges inspired us to combine the two modalities so that each offsets the shortcomings of the other. In this paper, we propose CrossFuNet, a novel RGB and depth information fusion network that improves the accuracy of 3D hand pose estimation. Specifically, the RGB image and the paired depth map are fed into two separate subnetworks. Their feature maps are combined in a fusion module, for which we propose a completely new approach to merging the information from the two modalities. The 3D keypoints are then regressed from heatmaps, following common practice. We validate our model on two public datasets, and the results show that it outperforms state-of-the-art methods.
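The abstract only states that keypoints are regressed from heatmaps "by the common method", without naming it. One common choice for decoding coordinates from a heatmap is a differentiable soft-argmax; the minimal sketch below illustrates that general technique only and is an assumption for illustration, not the authors' implementation (the `soft_argmax_2d` helper and the toy heatmap are invented here):

```python
import numpy as np

def soft_argmax_2d(heatmap):
    """Decode an (x, y) keypoint from a 2D heatmap via soft-argmax.

    A softmax turns the heatmap into a probability map, and the keypoint
    is the probability-weighted average of pixel coordinates. Unlike a
    hard argmax, this is differentiable, so it can sit at the end of a
    heatmap-regression network.
    """
    h, w = heatmap.shape
    # Numerically stable softmax over all pixels.
    p = np.exp(heatmap - heatmap.max())
    p /= p.sum()
    ys, xs = np.mgrid[0:h, 0:w]
    return float((p * xs).sum()), float((p * ys).sum())

# Toy heatmap with a single strong peak at (x=5, y=3).
hm = np.zeros((8, 8))
hm[3, 5] = 10.0
x, y = soft_argmax_2d(hm)  # close to (5.0, 3.0)
```

For 3D keypoints the same weighted-average idea extends to a third axis (e.g. a depth volume or a separate per-joint depth regressor); the 2D case above is just the simplest form of the decoding step.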


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a703/8473363/dbfb008c93d8/sensors-21-06095-g001.jpg
