
GPS-Gaussian+: Generalizable Pixel-wise 3D Gaussian Splatting for Real-Time Human-Scene Rendering from Sparse Views.

Authors

Zhou Boyao, Zheng Shunyuan, Tu Hanzhang, Shao Ruizhi, Liu Boning, Zhang Shengping, Nie Liqiang, Liu Yebin

Publication

IEEE Trans Pattern Anal Mach Intell. 2025 Apr 15;PP. doi: 10.1109/TPAMI.2025.3561248.

DOI: 10.1109/TPAMI.2025.3561248
PMID: 40232901
Abstract

Differentiable rendering techniques have recently shown promising results for free-viewpoint video synthesis of characters. However, such methods, whether based on Gaussian Splatting or neural implicit rendering, typically necessitate per-subject optimization, which does not meet the requirement of real-time rendering in interactive applications. We propose a generalizable Gaussian Splatting approach for high-resolution image rendering under a sparse-view camera setting. To this end, we introduce Gaussian parameter maps defined on the source views and directly regress Gaussian properties for instant novel view synthesis without any fine-tuning or optimization. We train our Gaussian parameter regression module on human-only or human-scene data, jointly with a depth estimation module that lifts 2D parameter maps to 3D space. The proposed framework is fully differentiable with both depth and rendering supervision, or with rendering supervision alone. We further introduce a regularization term and an epipolar attention mechanism to preserve geometric consistency between two source views, especially when depth supervision is omitted. Experiments on several datasets demonstrate that our method outperforms state-of-the-art methods while achieving a substantially higher rendering speed. Our project page is available at https://yaourtb.github.io/GPS-Gaussian+.
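As a rough illustration of the pixel-wise formulation described in the abstract, the sketch below unprojects a per-pixel depth map into 3D Gaussian centers and flattens accompanying parameter maps into per-Gaussian attributes. This is a minimal geometric sketch only: the function and variable names here are hypothetical, and in GPS-Gaussian+ the depth and parameter maps are regressed by learned networks rather than supplied directly.

```python
import numpy as np

def lift_gaussian_maps(depth, K, param_maps):
    """Lift 2D per-pixel Gaussian parameter maps into 3D space.

    depth      : (H, W) depth map (in the paper, predicted by the depth module)
    K          : (3, 3) camera intrinsics of the source view
    param_maps : dict of (H, W, C) maps, e.g. scale, rotation, opacity, color
    Returns (H*W, 3) Gaussian centers in camera space and flattened attributes.
    """
    H, W = depth.shape
    u, v = np.meshgrid(np.arange(W), np.arange(H))
    # Homogeneous pixel coordinates, one row per pixel (row-major order).
    pix = np.stack([u, v, np.ones_like(u)], axis=-1).reshape(-1, 3)
    # Back-project pixels to rays, then scale by depth to get 3D centers.
    rays = pix @ np.linalg.inv(K).T
    centers = rays * depth.reshape(-1, 1)
    # Each pixel of each parameter map becomes an attribute of one Gaussian.
    attrs = {k: m.reshape(H * W, -1) for k, m in param_maps.items()}
    return centers, attrs

# Toy usage with made-up intrinsics and a flat depth of 2 m.
K = np.array([[500., 0., 32.], [0., 500., 24.], [0., 0., 1.]])
depth = np.full((48, 64), 2.0)
params = {"opacity": np.ones((48, 64, 1))}
centers, attrs = lift_gaussian_maps(depth, K, params)
```

The pixel at the principal point (u=32, v=24) maps to a center on the optical axis at the given depth, which is a quick sanity check that the unprojection is consistent with the intrinsics.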


Similar Articles

1
GPS-Gaussian+: Generalizable Pixel-wise 3D Gaussian Splatting for Real-Time Human-Scene Rendering from Sparse Views.
IEEE Trans Pattern Anal Mach Intell. 2025 Apr 15;PP. doi: 10.1109/TPAMI.2025.3561248.
2
PGSR: Planar-Based Gaussian Splatting for Efficient and High-Fidelity Surface Reconstruction.
IEEE Trans Vis Comput Graph. 2025 Sep;31(9):6100-6111. doi: 10.1109/TVCG.2024.3494046.
3
MPGS: Multi-Plane Gaussian Splatting for Compact Scenes Rendering.
IEEE Trans Vis Comput Graph. 2025 May;31(5):3256-3266. doi: 10.1109/TVCG.2025.3549551. Epub 2025 Apr 25.
4
SplatLoc: 3D Gaussian Splatting-based Visual Localization for Augmented Reality.
IEEE Trans Vis Comput Graph. 2025 May;31(5):3591-3601. doi: 10.1109/TVCG.2025.3549563. Epub 2025 Apr 25.
5
Arbitrary Optics for Gaussian Splatting Using Space Warping.
J Imaging. 2024 Dec 22;10(12):330. doi: 10.3390/jimaging10120330.
6
Frequency-Aware Uncertainty Gaussian Splatting for Dynamic Scene Reconstruction.
IEEE Trans Vis Comput Graph. 2025 May;31(5):3558-3568. doi: 10.1109/TVCG.2025.3549143. Epub 2025 Apr 25.
7
Fov-GS: Foveated 3D Gaussian Splatting for Dynamic Scenes.
IEEE Trans Vis Comput Graph. 2025 May;31(5):2975-2985. doi: 10.1109/TVCG.2025.3549576. Epub 2025 Apr 25.
8
Look at the Sky: Sky-Aware Efficient 3D Gaussian Splatting in the Wild.
IEEE Trans Vis Comput Graph. 2025 May;31(5):3481-3491. doi: 10.1109/TVCG.2025.3549187. Epub 2025 Apr 25.
9
Real-Time High-Resolution View Synthesis of Complex Scenes With Explicit 3D Visibility Reasoning.
IEEE Trans Vis Comput Graph. 2025 Sep;31(9):6178-6189. doi: 10.1109/TVCG.2024.3499874.
10
Octree-GS: Towards Consistent Real-time Rendering with LOD-Structured 3D Gaussians.
IEEE Trans Pattern Anal Mach Intell. 2025 May 8;PP. doi: 10.1109/TPAMI.2025.3568201.