
A pan-sharpening network using multi-resolution transformer and two-stage feature fusion.

Authors

Fan Wensheng, Liu Fan, Li Jingzhi

Affiliations

College of Data Science, Taiyuan University of Technology, Jinzhong, Shanxi, China.

Publication

PeerJ Comput Sci. 2023 Jul 28;9:e1488. doi: 10.7717/peerj-cs.1488. eCollection 2023.

DOI: 10.7717/peerj-cs.1488
PMID: 37547419
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10403166/
Abstract

Pan-sharpening is a fundamental and crucial task in the remote sensing image processing field, which generates a high-resolution multi-spectral image by fusing a low-resolution multi-spectral image and a high-resolution panchromatic image. Recently, deep learning techniques have shown competitive results in pan-sharpening. However, diverse features in the multi-spectral and panchromatic images are not fully extracted and exploited in existing deep learning methods, which leads to information loss in the pan-sharpening process. To solve this problem, a novel pan-sharpening method based on multi-resolution transformer and two-stage feature fusion is proposed in this article. Specifically, a transformer-based multi-resolution feature extractor is designed to extract diverse image features. Then, to fully exploit features with different content and characteristics, a two-stage feature fusion strategy is adopted. In the first stage, a multi-resolution fusion module is proposed to fuse multi-spectral and panchromatic features at each scale. In the second stage, a shallow-deep fusion module is proposed to fuse shallow and deep features for detail generation. Experiments over QuickBird and WorldView-3 datasets demonstrate that the proposed method outperforms current state-of-the-art approaches visually and quantitatively with fewer parameters. Moreover, the ablation study and feature map analysis also prove the effectiveness of the transformer-based multi-resolution feature extractor and the two-stage fusion scheme.
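The abstract frames pan-sharpening as fusing a low-resolution multi-spectral (MS) image with a high-resolution panchromatic (PAN) image. For orientation only, a minimal classical baseline (Brovey-style component substitution, not the paper's transformer-based method) can be sketched in NumPy; the function name and toy array shapes below are illustrative assumptions:

```python
import numpy as np

def brovey_pansharpen(ms_up, pan, eps=1e-8):
    """Brovey-style component-substitution pan-sharpening.

    ms_up : (H, W, C) multi-spectral image, already upsampled to the PAN grid
    pan   : (H, W)    panchromatic image
    Each band is rescaled by the ratio of PAN to a synthetic intensity,
    injecting the high-resolution spatial detail into every band.
    """
    intensity = ms_up.mean(axis=2)            # synthetic intensity (band mean)
    ratio = pan / (intensity + eps)           # per-pixel detail-injection gain
    return ms_up * ratio[..., None]           # broadcast the gain over bands

# toy example: 4-band MS upsampled to a 4x4 PAN grid
rng = np.random.default_rng(0)
ms = rng.random((4, 4, 4))
pan = rng.random((4, 4))
fused = brovey_pansharpen(ms, pan)
# the fused image's band-mean intensity matches PAN up to numerical error
assert np.allclose(fused.mean(axis=2), pan, atol=1e-6)
```

Such component-substitution baselines transfer spatial detail well but are prone to the spectral distortion that learned fusion methods, including the one proposed here, aim to reduce.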

Figures (g001-g015):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/3f861bff4337/peerj-cs-09-1488-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/855fd7ad1222/peerj-cs-09-1488-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/065f3b54923e/peerj-cs-09-1488-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/0892fc4f3c6b/peerj-cs-09-1488-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/debb24b0a856/peerj-cs-09-1488-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/ea17080d73ee/peerj-cs-09-1488-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/677b9961dd74/peerj-cs-09-1488-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/b4cd1fda5db7/peerj-cs-09-1488-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/cfcbf7048354/peerj-cs-09-1488-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/e865eb7abc0b/peerj-cs-09-1488-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/c9d50eb5de28/peerj-cs-09-1488-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/09774b7861ff/peerj-cs-09-1488-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/b7b2f68cef0f/peerj-cs-09-1488-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/eae51d126c80/peerj-cs-09-1488-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1a4/10403166/4f357cc4a7e0/peerj-cs-09-1488-g015.jpg

Similar articles

1
A pan-sharpening network using multi-resolution transformer and two-stage feature fusion.
PeerJ Comput Sci. 2023 Jul 28;9:e1488. doi: 10.7717/peerj-cs.1488. eCollection 2023.
2
Pansharpening Model of Transferable Remote Sensing Images Based on Feature Fusion and Attention Modules.
Sensors (Basel). 2023 Mar 20;23(6):3275. doi: 10.3390/s23063275.
3
An IHS-Based Pan-Sharpening Method for Spectral Fidelity Improvement Using Ripplet Transform and Compressed Sensing.
Sensors (Basel). 2018 Oct 25;18(11):3624. doi: 10.3390/s18113624.
4
A Triple-Double Convolutional Neural Network for Panchromatic Sharpening.
IEEE Trans Neural Netw Learn Syst. 2023 Nov;34(11):9088-9101. doi: 10.1109/TNNLS.2022.3155655. Epub 2023 Oct 27.
5
A novel pansharpening method based on cross stage partial network and transformer.
Sci Rep. 2024 Jun 2;14(1):12631. doi: 10.1038/s41598-024-63336-w.
6
Cascaded Convolutional Neural Network-Based Hyperspectral Image Resolution Enhancement via an Auxiliary Panchromatic Image.
IEEE Trans Image Process. 2021;30:6815-6828. doi: 10.1109/TIP.2021.3098246. Epub 2021 Jul 30.
7
Artificial Intelligence-Based Deep Fusion Model for Pan-Sharpening of Remote Sensing Images.
Comput Intell Neurosci. 2021 Dec 23;2021:7615106. doi: 10.1155/2021/7615106. eCollection 2021.
8
Remote sensing for field pea yield estimation: A study of multi-scale data fusion approaches in phenomics.
Front Plant Sci. 2023 Mar 3;14:1111575. doi: 10.3389/fpls.2023.1111575. eCollection 2023.
9
GAFnet: Group Attention Fusion Network for PAN and MS Image High-Resolution Classification.
IEEE Trans Cybern. 2022 Oct;52(10):10556-10569. doi: 10.1109/TCYB.2021.3064571. Epub 2022 Sep 19.
10
TFormer: A throughout fusion transformer for multi-modal skin lesion diagnosis.
Comput Biol Med. 2023 May;157:106712. doi: 10.1016/j.compbiomed.2023.106712. Epub 2023 Feb 28.
