

A Multi-Stage Progressive Pansharpening Network Based on Detail Injection with Redundancy Reduction.

Authors

Wen Xincan, Ma Hongbing, Li Liangliang

Affiliations

School of Computer Science and Technology, Xinjiang University, Urumqi 830046, China.

Key Laboratory of Signal Detection and Processing, Xinjiang University, Urumqi 830046, China.

Publication

Sensors (Basel). 2024 Sep 18;24(18):6039. doi: 10.3390/s24186039.

DOI: 10.3390/s24186039
PMID: 39338784
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11435471/
Abstract

In the field of remote sensing image processing, pansharpening technology stands as a critical advancement. This technology aims to enhance multispectral images that possess low resolution by integrating them with high-spatial-resolution panchromatic images, ultimately producing multispectral images with high resolution that are abundant in both spatial and spectral details. Thus, there remains potential for improving the quality of both the spectral and spatial domains of the fused images based on deep-learning-based pansharpening methods. This work proposes a new method for the task of pansharpening: the Multi-Stage Progressive Pansharpening Network with Detail Injection with Redundancy Reduction Mechanism (MSPPN-DIRRM). This network is divided into three levels, each of which is optimized for the extraction of spectral and spatial data at different scales. Particular spectral feature and spatial detail extraction modules are used at each stage. Moreover, a new image reconstruction module named the DRRM is introduced in this work; it eliminates both spatial and channel redundancy and improves the fusion quality. The effectiveness of the proposed model is further supported by experimental results using both simulated data and real data from the QuickBird, GaoFen1, and WorldView2 satellites; these results show that the proposed model outperforms deep-learning-based methods in both visual and quantitative assessments. Among various evaluation metrics, performance improves by 0.92-18.7% compared to the latest methods.
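The detail-injection principle the abstract builds on can be illustrated with a minimal sketch. This is not the MSPPN-DIRRM network itself, only the classic injection scheme it extends: upsampled multispectral bands receive the high-frequency component of the panchromatic image. The box filter, its size `k`, and the `gain` parameter are illustrative assumptions, not details from the paper.

```python
import numpy as np

def box_lowpass(img, k=5):
    """Low-pass a 2-D image with a simple k x k box filter (reflect padding)."""
    pad = k // 2
    p = np.pad(img, pad, mode="reflect")
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + k, j:j + k].mean()
    return out

def detail_inject(ms_up, pan, gain=1.0):
    """Classic detail injection: add the PAN image's high-frequency
    residual to every band of the upsampled multispectral image."""
    detail = pan - box_lowpass(pan)          # spatial details to inject
    return ms_up + gain * detail[..., None]  # broadcast over the band axis
```

A constant panchromatic image carries no spatial detail, so injection leaves the multispectral bands unchanged; learned methods such as the one described above replace the fixed filter and gain with trained extraction and reconstruction modules.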


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/9a5034ea47f2/sensors-24-06039-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/1faabbf89b12/sensors-24-06039-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/fe985a5be0c8/sensors-24-06039-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/aa6ee5258965/sensors-24-06039-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/913840769e53/sensors-24-06039-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/23af8fedd042/sensors-24-06039-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/09743c215934/sensors-24-06039-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/d0971235a248/sensors-24-06039-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/beea1e618573/sensors-24-06039-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/f70444c9f8d7/sensors-24-06039-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/e39400a86eda/sensors-24-06039-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/c0f08bfc6c13/sensors-24-06039-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/9a58cf0993f4/sensors-24-06039-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/d596623c81c4/sensors-24-06039-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/cefd5a5de8c7/sensors-24-06039-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/897cacaf11b6/sensors-24-06039-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/f002722232bf/sensors-24-06039-g017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/494c/11435471/f85a28427339/sensors-24-06039-g018.jpg

Similar articles

1. A Multi-Stage Progressive Pansharpening Network Based on Detail Injection with Redundancy Reduction.
   Sensors (Basel). 2024 Sep 18;24(18):6039. doi: 10.3390/s24186039.
2. VOGTNet: Variational Optimization-Guided Two-Stage Network for Multispectral and Panchromatic Image Fusion.
   IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):9268-9282. doi: 10.1109/TNNLS.2024.3409563. Epub 2025 May 2.
3. Pansharpening Model of Transferable Remote Sensing Images Based on Feature Fusion and Attention Modules.
   Sensors (Basel). 2023 Mar 20;23(6):3275. doi: 10.3390/s23063275.
4. A New Pansharpening Method Based on Spatial and Spectral Sparsity Priors.
   IEEE Trans Image Process. 2014 Sep;23(9):4160-4174. doi: 10.1109/TIP.2014.2333661. Epub 2014 Jun 27.
5. FrMLNet: Framelet-Based Multilevel Network for Pansharpening.
   IEEE Trans Cybern. 2023 Jul;53(7):4594-4605. doi: 10.1109/TCYB.2021.3131651. Epub 2023 Jun 15.
6. Generative Dual-Adversarial Network With Spectral Fidelity and Spatial Enhancement for Hyperspectral Pansharpening.
   IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7303-7317. doi: 10.1109/TNNLS.2021.3084745. Epub 2022 Nov 30.
7. A spectral preserved model based on spectral contribution and dependence with detail injection for pansharpening.
   Sci Rep. 2023 Apr 27;13(1):6882. doi: 10.1038/s41598-023-33574-5.
8. Pansharpening with a Guided Filter Based on Three-Layer Decomposition.
   Sensors (Basel). 2016 Jul 12;16(7):1068. doi: 10.3390/s16071068.
9. Deep Variational Network for Blind Pansharpening.
   IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):9283-9297. doi: 10.1109/TNNLS.2024.3436850. Epub 2025 May 2.
10. An Adaptive Injection Model for Pansharpening.
    Comput Intell Neurosci. 2023 Jan 24;2023:4874974. doi: 10.1155/2023/4874974. eCollection 2023.
