Suppr 超能文献


Depth Map Recovery Based on a Unified Depth Boundary Distortion Model.

Authors

Wang Haotian, Yang Meng, Lan Xuguang, Zhu Ce, Zheng Nanning

Publication

IEEE Trans Image Process. 2022;31:7020-7035. doi: 10.1109/TIP.2022.3216768. Epub 2022 Nov 14.

DOI: 10.1109/TIP.2022.3216768
PMID: 36331641
Abstract

Depth maps acquired by either physical sensors or learning methods are often seriously distorted by boundary distortion problems, including missing, fake, and misaligned boundaries (compared with the RGB images). An RGB-guided depth map recovery method is proposed in this paper to recover true boundaries in seriously distorted depth maps. To this end, a unified model is first developed to observe all of these kinds of distorted boundaries in depth maps. Observing distorted boundaries is equivalent to identifying erroneous regions in distorted depth maps, because depth boundaries are essentially formed by contiguous regions with different intensities. Erroneous regions are then identified by separately extracting local structures of the RGB image and the depth map with Gaussian kernels and comparing their similarity on the basis of the SSIM index. A depth map recovery method is then proposed on the basis of the unified model. This method recovers true depth boundaries by iteratively identifying and correcting erroneous regions in the recovered depth map using the unified model and a weighted median filter. Because an RGB image generally includes additional textural content compared with a depth map, the texture-copy artifact problem is further addressed by restricting the model to work only around depth boundaries in each iteration. Extensive experiments are conducted on five RGB-depth datasets across depth map recovery, depth super-resolution, depth estimation enhancement, and depth completion enhancement. The results demonstrate that the proposed method considerably improves both the quantitative and visual quality of recovered depth maps in comparison with fifteen competitive methods. Most object boundaries in the recovered depth maps are corrected accurately, kept sharp, and well aligned with those in the RGB images.


Similar Articles

1. Depth Map Recovery Based on a Unified Depth Boundary Distortion Model.
IEEE Trans Image Process. 2022;31:7020-7035. doi: 10.1109/TIP.2022.3216768. Epub 2022 Nov 14.
2. RGB-Guided Depth Map Recovery by Two-Stage Coarse-to-Fine Dense CRF Models.
IEEE Trans Image Process. 2023;32:1315-1328. doi: 10.1109/TIP.2023.3242144. Epub 2023 Feb 23.
3. Efficient Depth Enhancement Using a Combination of Color and Depth Information.
Sensors (Basel). 2017 Jul 1;17(7):1544. doi: 10.3390/s17071544.
4. A Comprehensive Survey of Depth Completion Approaches.
Sensors (Basel). 2022 Sep 14;22(18):6969. doi: 10.3390/s22186969.
5. Recent Advances in Conventional and Deep Learning-Based Depth Completion: A Survey.
IEEE Trans Neural Netw Learn Syst. 2024 Mar;35(3):3395-3415. doi: 10.1109/TNNLS.2022.3201534. Epub 2024 Feb 29.
6. RDFC-GAN: RGB-Depth Fusion CycleGAN for Indoor Depth Completion.
IEEE Trans Pattern Anal Mach Intell. 2024 Nov;46(11):7088-7101. doi: 10.1109/TPAMI.2024.3388004. Epub 2024 Oct 3.
7. Edge Preserving and Multi-Scale Contextual Neural Network for Salient Object Detection.
IEEE Trans Image Process. 2018;27(1):121-134. doi: 10.1109/TIP.2017.2756825.
8. Depth Map Upsampling via Multi-Modal Generative Adversarial Network.
Sensors (Basel). 2019 Apr 2;19(7):1587. doi: 10.3390/s19071587.
9. Temporal and Spatial Denoising of Depth Maps.
Sensors (Basel). 2015 Jul 29;15(8):18506-25. doi: 10.3390/s150818506.
10. Edge-Preserving Depth Map Upsampling by Joint Trilateral Filter.
IEEE Trans Cybern. 2018 Jan;48(1):371-384. doi: 10.1109/TCYB.2016.2637661. Epub 2017 Jan 24.