Universal phase-depth mapping in a structured light field.

Authors

Cai Zewei, Liu Xiaoli, Peng Xiang, Gao Bruce Z

Publication

Appl Opt. 2018 Jan 1;57(1):A26-A32. doi: 10.1364/AO.57.000A26.

DOI: 10.1364/AO.57.000A26
PMID: 29328126
Abstract

Technologies and devices for light field imaging have recently been developed for both industrial applications and scientific research to achieve excellent imaging properties. In our previous work, we combined light field imaging with structured illumination to propose a structured light field method in which multidirectional depth estimation can be performed for high-quality 3D imaging. However, the projection axis was implicitly assumed to be perpendicular to the reference plane, a condition that is difficult to satisfy in practice. In this paper, we derive a universal phase-depth mapping in a structured light field by relaxing this implicit condition. Both a nonlinear and a linear model are proposed based on this universal relationship. To test the models' practical performance, we simulated experiments by adding errors to the real measured values to evaluate the deviation in depth estimation. By comparing the root-mean-square distributions of the depth deviations with respect to the depth positions, we demonstrated that the nonlinear model was precise and consistent over a wide depth range, and we employed this model to realize high-quality multidirectional scene reconstruction.
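
The error-injection evaluation described in the abstract can be sketched as a small Monte Carlo experiment. The snippet below is a minimal illustration, not the paper's implementation: it assumes a hypothetical nonlinear phase-to-depth mapping z = A·φ/(B + C·φ) with made-up coefficients A, B, C, perturbs the noise-free phase at each depth position with Gaussian errors (standing in for errors added to real measured values), and reports the per-position RMS depth deviation.

```python
import numpy as np

# Hypothetical nonlinear phase-to-depth mapping z = A*phi / (B + C*phi).
# A, B, C are illustrative placeholders, not the paper's calibrated values.
A, B, C = 500.0, 40.0, 1.0

def phase_to_depth(phi):
    """Map unwrapped phase (rad) to depth (mm) via the assumed nonlinear model."""
    return A * phi / (B + C * phi)

def depth_to_phase(z):
    """Invert the mapping to obtain the noise-free phase at a given depth."""
    return B * z / (A - C * z)

rng = np.random.default_rng(0)
depth_positions = np.linspace(50.0, 400.0, 8)  # depths at which to evaluate (mm)
sigma_phi = 0.02                               # assumed phase-error std (rad)
n_trials = 10_000

for z_true in depth_positions:
    phi_true = depth_to_phase(z_true)
    # Inject Gaussian errors into the phase, mimicking noisy measured values.
    phi_noisy = phi_true + rng.normal(0.0, sigma_phi, n_trials)
    z_est = phase_to_depth(phi_noisy)
    rms = np.sqrt(np.mean((z_est - z_true) ** 2))
    print(f"depth {z_true:6.1f} mm -> RMS depth deviation {rms:.4f} mm")
```

The per-position RMS reflects the local sensitivity dz/dφ of the assumed mapping, which is the kind of depth-dependent behavior the authors compare between the nonlinear and linear models.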

Similar Articles

1. Universal phase-depth mapping in a structured light field. Appl Opt. 2018 Jan 1;57(1):A26-A32. doi: 10.1364/AO.57.000A26.
2. Structured light field 3D imaging. Opt Express. 2016 Sep 5;24(18):20324-34. doi: 10.1364/OE.24.020324.
3. Ray calibration and phase mapping for structured-light-field 3D reconstruction. Opt Express. 2018 Mar 19;26(6):7598-7613. doi: 10.1364/OE.26.007598.
4. Real-time depth controllable integral imaging pickup and reconstruction method with a light field camera. Appl Opt. 2015 Dec 10;54(35):10333-41. doi: 10.1364/AO.54.010333.
5. Accurate depth estimation in structured light fields. Opt Express. 2019 Apr 29;27(9):13532-13546. doi: 10.1364/OE.27.013532.
6. Iterative reconstruction of scene depth with fidelity based on light field data. Appl Opt. 2017 Apr 10;56(11):3185-3192. doi: 10.1364/AO.56.003185.
7. 3D reconstruction of structured light fields based on point cloud adaptive repair for highly reflective surfaces. Appl Opt. 2021 Aug 20;60(24):7086-7093. doi: 10.1364/AO.431538.
8. Three-Dimensional Reconstruction of Light Field Based on Phase Similarity. Sensors (Basel). 2021 Nov 20;21(22):7734. doi: 10.3390/s21227734.
9. Flexible structured-light-based three-dimensional profile reconstruction method considering lens projection-imaging distortion. Appl Opt. 2012 May 1;51(13):2419-28. doi: 10.1364/AO.51.002419.
10. 3D shape measurement based on structured light field imaging. Math Biosci Eng. 2019 Oct 22;17(1):654-668. doi: 10.3934/mbe.2020034.

Cited By

1. Multiperspective Light Field Reconstruction Method via Transfer Reinforcement Learning. Comput Intell Neurosci. 2020 Feb 14;2020:8989752. doi: 10.1155/2020/8989752. eCollection 2020.