
Hyperspectral and SAR Image Classification via Multiscale Interactive Fusion Network

Author Information

Wang Junjie, Li Wei, Gao Yunhao, Zhang Mengmeng, Tao Ran, Du Qian

Publication Information

IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10823-10837. doi: 10.1109/TNNLS.2022.3171572. Epub 2023 Nov 30.

Abstract

Due to the limitations of single-source data, joint classification using multisource remote sensing data has received increasing attention. However, existing methods still have shortcomings in feature extraction from single-source data and feature fusion between multisource data. In this article, a multiscale interactive fusion network (MIFNet) for hyperspectral and synthetic aperture radar (SAR) image classification is proposed. First, a multiscale interactive information extraction (MIIE) block is designed to extract meaningful multiscale information. Compared with traditional multiscale models, it not only obtains richer scale information but also reduces the model parameters and lowers the network complexity. Furthermore, a global dependence fusion module (GDFM) is developed to fuse features from multisource data; it implements cross attention between multisource data from a global perspective and captures long-range dependence. Extensive experiments on three datasets demonstrate the superiority of the proposed method and the necessity of each module for accuracy improvement.
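The abstract describes the GDFM as applying cross attention between the two modalities from a global perspective. As a rough illustration only (the paper's actual module is not reproduced here), the sketch below shows generic cross attention in NumPy: queries come from a hypothetical hyperspectral feature branch, while keys and values come from a SAR branch, so every hyperspectral position attends to every SAR position, which is what gives the fusion its long-range, global character. All names and shapes are assumptions for the sake of the example.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(hsi_feat, sar_feat):
    """Fuse SAR features into the HSI branch via global cross attention.

    hsi_feat: (n, d) query features from the hyperspectral branch
    sar_feat: (m, d) key/value features from the SAR branch
    Returns an (n, d) fused representation in which each HSI position
    is a weighted sum over *all* SAR positions (long-range dependence).
    """
    d = hsi_feat.shape[-1]
    scores = hsi_feat @ sar_feat.T / np.sqrt(d)   # (n, m) pairwise affinities
    weights = softmax(scores, axis=-1)            # each row sums to 1
    return weights @ sar_feat                     # (n, d) attended SAR context

# Toy usage with made-up dimensions: 64 spatial positions, 32 channels.
rng = np.random.default_rng(0)
hsi = rng.standard_normal((64, 32))
sar = rng.standard_normal((64, 32))
fused = cross_attention(hsi, sar)
print(fused.shape)  # (64, 32)
```

In a real network these features would come from learned convolutional branches and the query/key/value maps would be learned projections; the point here is only the global all-pairs attention pattern the abstract attributes to the GDFM.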

