Trouvé-Peloux P, Champagnat F, Le Besnerais G, Druart G, Idier J
J Opt Soc Am A Opt Image Sci Vis. 2021 Oct 1;38(10):1489-1500. doi: 10.1364/JOSAA.424621.
In this paper, we present a generic performance model that evaluates the accuracy of depth estimation using depth from defocus (DFD). This model requires only the sensor point spread function at a given depth to evaluate the theoretical accuracy of depth estimation. Hence, it can be used for any (un)conventional system, using either one or several images. This model is validated experimentally on two unconventional DFD cameras, using either a coded aperture or a lens with chromatic aberration. Then, we use the proposed model for the end-to-end design of a 3D camera using an unconventional lens with chromatic aberration, for the specific use case of small unmanned aerial vehicle navigation.
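The abstract states that the model predicts theoretical depth accuracy from the sensor point spread function alone. One common way such a bound is obtained is a Cramér-Rao analysis: under additive Gaussian noise, the variance of any unbiased depth estimator is lower-bounded by the inverse Fisher information, which depends on how fast the blurred image changes with depth. The sketch below is a simplified, hypothetical illustration of that idea (not the paper's actual model): it assumes a Gaussian defocus PSF whose width is a toy function of depth, a known scene, and a single image, and computes the bound via a finite-difference derivative of the blurred image with respect to depth.

```python
import numpy as np

def gaussian_psf(sigma, size=21):
    # Isotropic Gaussian defocus kernel, normalized to unit sum.
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def blur(scene, psf):
    # Circular convolution via FFT (PSF zero-padded to scene size).
    return np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf, scene.shape)))

def crlb_depth(scene, depth, sigma_noise, blur_of_depth, dz=1e-3):
    # Fisher information for a scalar depth z under i.i.d. Gaussian noise:
    #   I(z) = || d I_blur / dz ||^2 / sigma_noise^2,  CRLB(z) = 1 / I(z).
    # The image derivative w.r.t. depth is approximated by central differences.
    i_plus = blur(scene, gaussian_psf(blur_of_depth(depth + dz)))
    i_minus = blur(scene, gaussian_psf(blur_of_depth(depth - dz)))
    dI_dz = (i_plus - i_minus) / (2.0 * dz)
    fisher = np.sum(dI_dz**2) / sigma_noise**2
    return 1.0 / fisher

# Toy defocus model (hypothetical): PSF width grows away from the 1 m focus plane.
rng = np.random.default_rng(0)
scene = rng.standard_normal((64, 64))
blur_model = lambda z: 0.5 + 2.0 * abs(z - 1.0)
bound = crlb_depth(scene, depth=1.3, sigma_noise=0.01, blur_of_depth=blur_model)
print(bound)  # lower bound on depth variance, in m^2
```

In this simplified setting the bound depends only on the PSF's variation with depth, the scene content, and the noise level, which mirrors the abstract's claim that the PSF at a given depth suffices to predict theoretical accuracy; the paper's actual model generalizes this to unknown scenes and multi-image or unconventional (coded-aperture, chromatic) systems.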