Suppr 超能文献


3D reconstruction of light-field images based on spatiotemporal correlation super-resolution

Publication Info

Appl Opt. 2023 Apr 20;62(12):3016-3027. doi: 10.1364/AO.484909.

DOI: 10.1364/AO.484909
PMID: 37133148
Abstract

In this paper, we take full advantage of the information correlation among subaperture images and propose a new super-resolution (SR) reconstruction method based on spatiotemporal correlation to achieve SR reconstruction of light-field images. In addition, an offset compensation method based on optical flow and a spatial transformer network is designed to realize accurate compensation between adjacent light-field subaperture images. The resulting high-resolution light-field images are then combined with a self-designed system based on phase similarity and SR reconstruction to realize accurate 3D reconstruction of a structured light field. Finally, experimental results demonstrate that the proposed method performs accurate 3D reconstruction of light-field images from the SR data. Overall, our method makes full use of the redundant information shared by different subaperture images, hides the upsampling step inside the convolution, provides richer information, and eliminates time-consuming procedures, enabling more efficient and accurate 3D reconstruction of light-field images.
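The offset-compensation step the abstract describes — warping an adjacent subaperture image onto a reference view using a dense optical-flow field — can be sketched as bilinear resampling. This is an illustrative stand-in only: the function name `warp_with_flow` and the NumPy implementation are assumptions, not the authors' code, and a real pipeline would estimate the flow with a learned network rather than take it as given.

```python
import numpy as np

def warp_with_flow(image, flow):
    """Warp `image` toward a reference view using per-pixel offsets.

    Toy stand-in for optical-flow + spatial-transformer offset
    compensation between adjacent light-field subaperture images.
    `flow[..., 0]` holds x displacements, `flow[..., 1]` y displacements.
    Bilinear sampling mirrors the differentiable sampler of a
    spatial transformer network.
    """
    h, w = image.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    # Source coordinates for each output pixel, clamped to the image.
    src_x = np.clip(xs + flow[..., 0], 0, w - 1)
    src_y = np.clip(ys + flow[..., 1], 0, h - 1)
    # Integer corners and fractional weights for bilinear interpolation.
    x0 = np.floor(src_x).astype(int)
    y0 = np.floor(src_y).astype(int)
    x1 = np.clip(x0 + 1, 0, w - 1)
    y1 = np.clip(y0 + 1, 0, h - 1)
    wx = src_x - x0
    wy = src_y - y0
    top = image[y0, x0] * (1 - wx) + image[y0, x1] * wx
    bot = image[y1, x0] * (1 - wx) + image[y1, x1] * wx
    return top * (1 - wy) + bot * wy
```

With an integer flow of +1 pixel in x, the warped image simply reads each pixel from its right-hand neighbor; fractional flows blend the four surrounding pixels, which is what lets the compensated views be fused for SR reconstruction.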


Similar Articles

1. 3D reconstruction of light-field images based on spatiotemporal correlation super-resolution.
   Appl Opt. 2023 Apr 20;62(12):3016-3027. doi: 10.1364/AO.484909.
2. 3D reconstruction of structured light fields based on point cloud adaptive repair for highly reflective surfaces.
   Appl Opt. 2021 Aug 20;60(24):7086-7093. doi: 10.1364/AO.431538.
3. Real-time optical reconstruction for a three-dimensional light-field display based on path-tracing and CNN super-resolution.
   Opt Express. 2021 Nov 8;29(23):37862-37876. doi: 10.1364/OE.441714.
4. Technical Note: Real-time 3D MRI in the presence of motion for MRI-guided radiotherapy: 3D Dynamic keyhole imaging with super-resolution.
   Med Phys. 2019 Oct;46(10):4631-4638. doi: 10.1002/mp.13748. Epub 2019 Aug 27.
5. Edge-enhanced infrared image super-resolution reconstruction model under transformer.
   Sci Rep. 2024 Jul 6;14(1):15585. doi: 10.1038/s41598-024-66302-8.
6. Fusing multi-scale information in convolution network for MR image super-resolution reconstruction.
   Biomed Eng Online. 2018 Aug 25;17(1):114. doi: 10.1186/s12938-018-0546-9.
7. Light Field Image Super-Resolution Using Deformable Convolution.
   IEEE Trans Image Process. 2021;30:1057-1071. doi: 10.1109/TIP.2020.3042059. Epub 2020 Dec 11.
8. 3D MRI Reconstruction Based on 2D Generative Adversarial Network Super-Resolution.
   Sensors (Basel). 2021 Apr 23;21(9):2978. doi: 10.3390/s21092978.
9. 3D dense convolutional neural network for fast and accurate single MR image super-resolution.
   Comput Med Imaging Graph. 2021 Oct;93:101973. doi: 10.1016/j.compmedimag.2021.101973. Epub 2021 Aug 20.
10. Super-resolution imaging for infrared micro-scanning optical system.
    Opt Express. 2019 Mar 4;27(5):7719-7737. doi: 10.1364/OE.27.007719.