Suppr 超能文献



Stylizing Sparse-View 3D Scenes With Hierarchical Neural Representation.

Authors

Wang Yifan, Gao Ang, Gong Yi, Zeng Yuan

Publication

IEEE Trans Vis Comput Graph. 2025 Oct;31(10):7876-7889. doi: 10.1109/TVCG.2025.3558468.

DOI: 10.1109/TVCG.2025.3558468
PMID: 40193263
Abstract

3D scene stylization refers to generating stylized images of the scene at arbitrary novel view angles following a given set of style images while ensuring consistency when rendered from different views. Recently, several 3D style transfer methods leveraging the scene reconstruction capabilities of pre-trained neural radiance fields (NeRF) have been proposed. To successfully stylize a scene this way, one must first reconstruct a photo-realistic radiance field from collected images of the scene. However, when only sparse input views are available, pre-trained few-shot NeRFs often suffer from high-frequency artifacts, which are generated as a by-product of high-frequency details for improving reconstruction quality. Is it possible to generate more faithful stylized scenes from sparse inputs by directly optimizing encoding-based scene representation with target style? In this paper, we consider the stylization of sparse-view scenes in terms of disentangling content semantics and style textures. We propose a coarse-to-fine sparse-view scene stylization framework, where a novel hierarchical encoding-based neural representation is designed to generate high-quality stylized scenes directly from implicit scene representations. We also propose a new optimization strategy with content strength annealing to achieve realistic stylization and better content preservation. Extensive experiments demonstrate that our method can achieve high-quality stylization of sparse-view scenes and outperforms fine-tuning-based baselines in terms of stylization quality and efficiency.
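The abstract mentions an optimization strategy with "content strength annealing" (early steps favor content preservation, later steps let style textures dominate) but does not give the schedule. The sketch below illustrates the general idea under the assumption of a simple linear decay; the function names, weights, and schedule are illustrative, not the paper's.

```python
def annealed_content_weight(step, total_steps, w_start=1.0, w_end=0.1):
    """Linearly decay the content-loss weight from w_start to w_end.

    The paper's exact annealing schedule is not stated in the abstract;
    a linear decay is one plausible choice (an assumption).
    """
    t = min(step / total_steps, 1.0)
    return w_start + (w_end - w_start) * t


def stylization_loss(content_loss, style_loss, step, total_steps):
    """Combine content and style terms with an annealed content weight,
    so early iterations preserve scene semantics and later iterations
    shift emphasis toward the target style textures."""
    w_c = annealed_content_weight(step, total_steps)
    return w_c * content_loss + style_loss
```

In practice `content_loss` and `style_loss` would be computed from feature maps of rendered views (e.g. perceptual and Gram-matrix losses); only the weighting schedule is sketched here.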


Similar Articles

1
Stylizing Sparse-View 3D Scenes With Hierarchical Neural Representation.
IEEE Trans Vis Comput Graph. 2025 Oct;31(10):7876-7889. doi: 10.1109/TVCG.2025.3558468.
2
StylizedGS: Controllable Stylization for 3D Gaussian Splatting.
IEEE Trans Pattern Anal Mach Intell. 2025 Aug 28;PP. doi: 10.1109/TPAMI.2025.3604010.
3
MM-NeRF: Multimodal-Guided 3D Multi-Style Transfer of Neural Radiance Field.
IEEE Trans Vis Comput Graph. 2025 Sep;31(9):5842-5853. doi: 10.1109/TVCG.2024.3476331.
4
Prescription of Controlled Substances: Benefits and Risks
5
Learning Heterogeneous Mixture of Scene Experts for Large-Scale Neural Radiance Fields.
IEEE Trans Pattern Anal Mach Intell. 2025 Aug 27;PP. doi: 10.1109/TPAMI.2025.3603305.
6
UPST-NeRF: Universal Photorealistic Style Transfer of Neural Radiance Fields for 3D Scene.
IEEE Trans Vis Comput Graph. 2025 Apr;31(4):2045-2057. doi: 10.1109/TVCG.2024.3378692. Epub 2025 Feb 27.
7
Short-Term Memory Impairment
8
Sparse-view spectral CT reconstruction via a coupled subspace representation and score-based generative model.
Quant Imaging Med Surg. 2025 Jun 6;15(6):5474-5495. doi: 10.21037/qims-24-2226. Epub 2025 May 28.
9
Aspects of Genetic Diversity, Host Specificity and Public Health Significance of Single-Celled Intestinal Parasites Commonly Observed in Humans and Mostly Referred to as 'Non-Pathogenic'.
APMIS. 2025 Sep;133(9):e70036. doi: 10.1111/apm.70036.
10
General 3D Vision-Language Model With Fast Rendering and Pre-Training Vision-Language Alignment.
IEEE Trans Pattern Anal Mach Intell. 2025 Sep;47(9):7352-7368. doi: 10.1109/TPAMI.2025.3566593.