Environment-Aware Rendering and Interaction in Web-Based Augmented Reality

Authors

Ferrão José, Dias Paulo, Santos Beatriz Sousa, Oliveira Miguel

Affiliations

Department of Electronics, Telecommunications, and Informatics (DETI), University of Aveiro, 3810-193 Aveiro, Portugal.

Intelligent System Associate Laboratory (LASI), Institute of Electronics and Informatics Engineering of Aveiro (IEETA), University of Aveiro, 3810-193 Aveiro, Portugal.

Publication

J Imaging. 2023 Mar 8;9(3):63. doi: 10.3390/jimaging9030063.

PMID: 36976114
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10057055/
Abstract

This work presents a novel framework for web-based environment-aware rendering and interaction in augmented reality based on WebXR and three.js. It aims at accelerating the development of device-agnostic Augmented Reality (AR) applications. The solution allows for a realistic rendering of 3D elements, handles geometry occlusion, casts shadows of virtual objects onto real surfaces, and provides physics interaction with real-world objects. Unlike most existing state-of-the-art systems that are built to run on a specific hardware configuration, the proposed solution targets the web environment and is designed to work on a vast range of devices and configurations. Our solution can use monocular camera setups with depth data estimated by deep neural networks or, when available, use higher-quality depth sensors (e.g., LIDAR, structured light) that provide a more accurate perception of the environment. To ensure consistency in the rendering of the virtual scene, a physically based rendering pipeline is used, in which physically correct attributes are associated with each 3D object; combined with lighting information captured by the device, this enables the rendering of AR content matching the environment illumination. All these concepts are integrated and optimized into a pipeline capable of providing a fluid user experience even on mid-range devices. The solution is distributed as an open-source library that can be integrated into existing and new web-based AR projects. The proposed framework was evaluated and compared in terms of performance and visual features with two state-of-the-art alternatives.
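The geometry-occlusion idea in the abstract comes down to a per-pixel depth comparison: a virtual fragment is hidden whenever the real surface sampled from the device's depth data (e.g., the WebXR Depth Sensing API, or a neural-network estimate) lies closer to the camera. A minimal sketch of that test, with a hypothetical `isFragmentOccluded` helper not taken from the paper's library:

```javascript
// Hypothetical sketch of the per-fragment occlusion test described in the
// abstract. Both arguments are distances from the camera in meters:
// `realDepthMeters` sampled from the environment depth buffer,
// `virtualDepthMeters` for the virtual fragment being shaded.
function isFragmentOccluded(realDepthMeters, virtualDepthMeters, biasMeters = 0.01) {
  // Invalid or missing depth samples (sensor holes, out-of-range values)
  // are treated as "not occluded" so virtual content still renders.
  if (!Number.isFinite(realDepthMeters) || realDepthMeters <= 0) return false;
  // A small bias prevents flicker where real and virtual depths nearly coincide.
  return realDepthMeters + biasMeters < virtualDepthMeters;
}

// Example: a real wall at 1.2 m hides a virtual cube placed at 2.0 m,
// but not one placed at 1.0 m in front of it.
console.log(isFragmentOccluded(1.2, 2.0)); // true
console.log(isFragmentOccluded(1.2, 1.0)); // false
```

In a real WebXR pipeline this comparison would run in a fragment shader against the depth texture exposed by the session, rather than in JavaScript per pixel; the sketch only illustrates the decision rule.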

[Figures 1–26 of the article are available via the PMC full text linked above.]

Similar articles

1. Environment-Aware Rendering and Interaction in Web-Based Augmented Reality. J Imaging. 2023 Mar 8;9(3):63. doi: 10.3390/jimaging9030063.
2. Long-Range Augmented Reality with Dynamic Occlusion Rendering. IEEE Trans Vis Comput Graph. 2021 Nov;27(11):4236-4244. doi: 10.1109/TVCG.2021.3106434. Epub 2021 Oct 27.
3. SLAM-based dense surface reconstruction in monocular Minimally Invasive Surgery and its application to Augmented Reality. Comput Methods Programs Biomed. 2018 May;158:135-146. doi: 10.1016/j.cmpb.2018.02.006. Epub 2018 Feb 8.
4. Multi-Resolution 3D Rendering for High-Performance Web AR. Sensors (Basel). 2023 Aug 3;23(15):6885. doi: 10.3390/s23156885.
5. Physically-inspired Deep Light Estimation from a Homogeneous-Material Object for Mixed Reality Lighting. IEEE Trans Vis Comput Graph. 2020 May;26(5):2002-2011. doi: 10.1109/TVCG.2020.2973050. Epub 2020 Feb 13.
6. Online tools to easily build virtual molecular models for display in augmented and virtual reality on the web. J Mol Graph Model. 2022 Jul;114:108164. doi: 10.1016/j.jmgm.2022.108164. Epub 2022 Mar 17.
7. LivePhantom: Retrieving Virtual World Light Data to Real Environments. PLoS One. 2016 Dec 8;11(12):e0166424. doi: 10.1371/journal.pone.0166424. eCollection 2016.
8. A Real-Time Method for Inserting Virtual Objects Into Neural Radiance Fields. IEEE Trans Vis Comput Graph. 2025 Sep;31(9):4896-4907. doi: 10.1109/TVCG.2024.3422814.
9. Natural Environment Illumination: Coherent Interactive Augmented Reality for Mobile and Non-Mobile Devices. IEEE Trans Vis Comput Graph. 2017 Nov;23(11):2474-2484. doi: 10.1109/TVCG.2017.2734426. Epub 2017 Aug 10.
10. Developing the Next Generation of Augmented Reality Games for Pediatric Healthcare: An Open-Source Collaborative Framework Based on ARCore for Implementing Teaching, Training and Monitoring Applications. Sensors (Basel). 2021 Mar 7;21(5):1865. doi: 10.3390/s21051865.

References cited in this article

1. A theory of frequency domain invariants: spherical harmonic identities for BRDF/lighting transfer and image consistency. IEEE Trans Pattern Anal Mach Intell. 2008 Feb;30(2):197-213. doi: 10.1109/TPAMI.2007.1162.
2. BRDF-shop: creating physically correct bidirectional reflectance distribution functions. IEEE Comput Graph Appl. 2006 Jan-Feb;26(1):30-6. doi: 10.1109/mcg.2006.13.