

A spectral preserved model based on spectral contribution and dependence with detail injection for pansharpening.

Affiliations

College of Mathematics and Computer, Xinyu University, Xinyu, 338004, China.

School of Economics and Management, Xinyu University, Xinyu, 338004, China.

Publication

Sci Rep. 2023 Apr 27;13(1):6882. doi: 10.1038/s41598-023-33574-5.

DOI: 10.1038/s41598-023-33574-5
PMID: 37106003
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10140272/
Abstract

Pansharpening integrates the high spectral content of multispectral (MS) images and the fine spatial information of the corresponding panchromatic (PAN) images to produce an image with high spectral and spatial resolution. Traditional pansharpening methods compensate for the MS image's lack of spatial detail using the PAN image's details, which easily causes spectral distortion. To achieve spectral fidelity, a spectral preservation model based on spectral contribution and dependence with detail injection is proposed for pansharpening. In the proposed model, first, an efficacy coefficient (CE) based on the spatial difference between the MS and PAN images is designed to suppress the impact of detail injection on the spectra. Second, the spectral contribution and dependence (SCD) between the MS bands and pixels are considered to strengthen the internal adaptation of the spectra. Finally, a spectral preservation model based on CE and SCD is designed to enforce spectral fidelity in the fused image when the MS image is pansharpened with the details of the PAN image. Experimental results show that the proposed model is effective.
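The abstract does not give the authors' formulas for CE or SCD, but the detail-injection framework it builds on has a well-known general form: extract spatial details from the PAN image and add them to each MS band with a per-band gain. A minimal numpy sketch of that generic framework (not the paper's specific model; the intensity proxy and covariance-based gain here are common textbook choices, labeled as assumptions) might look like:

```python
import numpy as np

def detail_injection_pansharpen(ms_up, pan, gains=None):
    """Generic detail-injection pansharpening sketch (not the paper's model).

    ms_up : (H, W, B) multispectral image, upsampled to PAN resolution
    pan   : (H, W) panchromatic image
    gains : optional (B,) per-band injection gains; if omitted, a common
            covariance-based choice is used (an assumption, not the
            paper's CE/SCD scheme)
    """
    # Low-resolution intensity proxy for PAN: per-pixel mean of MS bands.
    intensity = ms_up.mean(axis=2)
    details = pan - intensity  # spatial details to inject into each band

    if gains is None:
        # Band-wise gain cov(MS_k, I) / var(I): scales the injected
        # details to each band's correlation with the intensity.
        flat_i = intensity.ravel()
        var_i = flat_i.var(ddof=1)
        gains = np.array([
            np.cov(ms_up[..., k].ravel(), flat_i)[0, 1] / var_i
            for k in range(ms_up.shape[2])
        ])

    # Inject weighted details into every band via broadcasting.
    return ms_up + details[..., None] * gains[None, None, :]

# Tiny synthetic demo
rng = np.random.default_rng(0)
ms = rng.random((8, 8, 4))
pan = ms.mean(axis=2) + 0.1 * rng.random((8, 8))
out = detail_injection_pansharpen(ms, pan)
print(out.shape)  # prints (8, 8, 4)
```

The paper's contribution, per the abstract, is precisely in replacing the naive uniform injection above with CE-suppressed, SCD-adapted injection so the fused bands stay spectrally faithful.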


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5ecc/10140272/22adcd70bd10/41598_2023_33574_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5ecc/10140272/8d79b1b79a26/41598_2023_33574_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5ecc/10140272/4bc899a03e99/41598_2023_33574_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5ecc/10140272/00b33fdfa71e/41598_2023_33574_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5ecc/10140272/933ebf8ac82a/41598_2023_33574_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/5ecc/10140272/c3d31e72e819/41598_2023_33574_Fig6_HTML.jpg

Similar Articles

1. A spectral preserved model based on spectral contribution and dependence with detail injection for pansharpening.
Sci Rep. 2023 Apr 27;13(1):6882. doi: 10.1038/s41598-023-33574-5.
2. Spectrum Correction Using Modeled Panchromatic Image for Pansharpening.
J Imaging. 2020 Apr 6;6(4):20. doi: 10.3390/jimaging6040020.
3. An Adaptive Injection Model for Pansharpening.
Comput Intell Neurosci. 2023 Jan 24;2023:4874974. doi: 10.1155/2023/4874974. eCollection 2023.
4. An Improved Pansharpening Method for Misaligned Panchromatic and Multispectral Data.
Sensors (Basel). 2018 Feb 11;18(2):557. doi: 10.3390/s18020557.
5. Generative Dual-Adversarial Network With Spectral Fidelity and Spatial Enhancement for Hyperspectral Pansharpening.
IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):7303-7317. doi: 10.1109/TNNLS.2021.3084745. Epub 2022 Nov 30.
6. A Unified Pansharpening Model Based on Band-Adaptive Gradient and Detail Correction.
IEEE Trans Image Process. 2022;31:918-933. doi: 10.1109/TIP.2021.3137020. Epub 2022 Jan 6.
7. Variational Pansharpening for Hyperspectral Imagery Constrained by Spectral Shape and Gram-Schmidt Transformation.
Sensors (Basel). 2018 Dec 7;18(12):4330. doi: 10.3390/s18124330.
8. A New Pansharpening Method Based on Spatial and Spectral Sparsity Priors.
IEEE Trans Image Process. 2014 Sep;23(9):4160-4174. doi: 10.1109/TIP.2014.2333661. Epub 2014 Jun 27.
9. FrMLNet: Framelet-Based Multilevel Network for Pansharpening.
IEEE Trans Cybern. 2023 Jul;53(7):4594-4605. doi: 10.1109/TCYB.2021.3131651. Epub 2023 Jun 15.
10. Pansharpening with a Guided Filter Based on Three-Layer Decomposition.
Sensors (Basel). 2016 Jul 12;16(7):1068. doi: 10.3390/s16071068.

Cited By

1. A novel pansharpening method based on side window filter and new injection gain matrices.
Sci Rep. 2025 Jul 18;15(1):26052. doi: 10.1038/s41598-025-08929-9.
2. Bayesian decision based fusion algorithm for remote sensing images.
Sci Rep. 2024 May 21;14(1):11558. doi: 10.1038/s41598-024-60394-y.

References

1. The nonsubsampled contourlet transform: theory, design, and applications.
IEEE Trans Image Process. 2006 Oct;15(10):3089-101. doi: 10.1109/tip.2006.877507.