

Remote sensing image dehazing using generative adversarial network with texture and color space enhancement.

Author Information

Shen Helin, Zhong Tie, Jia Yanfei, Wu Chunming

Affiliations

Key Laboratory of Modern Power System Simulation and Control and Renewable Energy Technology (Ministry of Education), Department of Communication Engineering, College of Electric Engineering, Northeast Electric Power University, Jilin, 132012, China.

College of Electric Power Engineering, Beihua University, Jilin, 132012, China.

Publication Information

Sci Rep. 2024 May 29;14(1):12382. doi: 10.1038/s41598-024-63259-6.

DOI: 10.1038/s41598-024-63259-6
PMID: 38811675
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11137037/
Abstract

Remote sensing is gradually playing an important role in the detection of ground information. However, the quality of remote-sensing images has always suffered from unexpected natural conditions, such as intense haze. Recently, convolutional neural networks (CNNs) have been applied to deal with dehazing problems, and some important findings have been obtained. Unfortunately, the performance of these classical CNN-based methods still needs further enhancement owing to their limited feature extraction capability. As a critical branch of CNNs, the generative adversarial network (GAN), composed of a generator and discriminator, has become a hot research topic and is considered a feasible approach to solving the dehazing problems. In this study, a novel dehazing GAN is proposed to reconstruct clean images from hazy ones. In the generator network of the proposed GAN, the color and luminance feature extraction module and the high-frequency feature extraction module extract multi-scale features and color space characteristics, which help the network acquire texture, color, and luminance information. Meanwhile, a color loss function based on hue saturation value (HSV) is also proposed to enhance the performance in color recovery. For the discriminator network, a parallel structure is designed to enhance the extraction of texture and background information. Synthetic and real hazy images are used to check the performance of the proposed method. The experimental results demonstrate that the proposed method significantly improves image quality, with a marked increase in peak signal-to-noise ratio (PSNR). Compared with other popular methods, the dehazing results of the proposed method closely resemble haze-free images.
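The two quantitative ingredients named in the abstract — an HSV-based color loss and PSNR evaluation — can be illustrated with a minimal NumPy sketch. The exact loss used by the authors is not given here, so the L1-over-HSV form below, along with the names `hsv_color_loss` and `psnr`, is a hypothetical stand-in for illustration only; the RGB-to-HSV conversion follows the standard formulas.

```python
import numpy as np

def rgb_to_hsv(img):
    """Vectorized RGB -> HSV for float images in [0, 1], shape (H, W, 3).
    Hue is returned in [0, 1) following the standard hexagonal formulas."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    maxc = img.max(axis=-1)
    minc = img.min(axis=-1)
    delta = maxc - minc
    v = maxc
    s = np.where(maxc > 0, delta / np.where(maxc > 0, maxc, 1), 0.0)
    mask = delta > 0
    safe = np.where(mask, delta, 1)          # avoid division by zero
    rc = (maxc - r) / safe
    gc = (maxc - g) / safe
    bc = (maxc - b) / safe
    h = np.where(maxc == r, bc - gc,
        np.where(maxc == g, 2.0 + rc - bc, 4.0 + gc - rc))
    h = np.where(mask, (h / 6.0) % 1.0, 0.0)  # gray pixels get hue 0
    return np.stack([h, s, v], axis=-1)

def hsv_color_loss(dehazed, clean):
    """Hypothetical color loss: mean absolute error in HSV space."""
    return np.abs(rgb_to_hsv(dehazed) - rgb_to_hsv(clean)).mean()

def psnr(x, y, peak=1.0):
    """Peak signal-to-noise ratio in dB for images in [0, peak]."""
    mse = np.mean((x - y) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```

A penalty in HSV space separates chromatic error (hue, saturation) from brightness error (value), which is one plausible reading of why such a loss would help color recovery under haze, where scattering desaturates colors more than it darkens them.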


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/618f0ca0b14c/41598_2024_63259_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/6fee810c892a/41598_2024_63259_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/4e6210103951/41598_2024_63259_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/92d28198de4c/41598_2024_63259_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/f819ee0f007c/41598_2024_63259_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/34f02190953c/41598_2024_63259_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/65849a62c428/41598_2024_63259_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/bd234209d1c4/41598_2024_63259_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/b56cc6c60adb/41598_2024_63259_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/c1ab6c21deaf/41598_2024_63259_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/60c851d78334/41598_2024_63259_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/6d30996f852b/41598_2024_63259_Fig12_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/c2b55cb571ad/41598_2024_63259_Fig13_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/8df4c376f762/41598_2024_63259_Fig14_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ef15/11137037/f9f4a17b4daf/41598_2024_63259_Fig15_HTML.jpg

Similar Articles

1. Remote sensing image dehazing using generative adversarial network with texture and color space enhancement.
Sci Rep. 2024 May 29;14(1):12382. doi: 10.1038/s41598-024-63259-6.
2. Remote Sensing Image Dehazing through an Unsupervised Generative Adversarial Network.
Sensors (Basel). 2023 Aug 28;23(17):7484. doi: 10.3390/s23177484.
3. Fusion of Heterogeneous Adversarial Networks for Single Image Dehazing.
IEEE Trans Image Process. 2020 Feb 28. doi: 10.1109/TIP.2020.2975986.
4. A feature-supervised generative adversarial network for environmental monitoring during hazy days.
Sci Total Environ. 2020 Dec 15;748:141445. doi: 10.1016/j.scitotenv.2020.141445. Epub 2020 Aug 8.
5. An Adversarial Dual-Branch Network for Nonhomogeneous Dehazing in Tunnel Construction.
Sensors (Basel). 2023 Nov 17;23(22):9245. doi: 10.3390/s23229245.
6. Low-light image enhancement using generative adversarial networks.
Sci Rep. 2024 Aug 9;14(1):18489. doi: 10.1038/s41598-024-69505-1.
7. Gated Dehazing Network via Least Square Adversarial Learning.
Sensors (Basel). 2020 Nov 5;20(21):6311. doi: 10.3390/s20216311.
8. AgriGAN: unpaired image dehazing via a cycle-consistent generative adversarial network for the agricultural plant phenotype.
Sci Rep. 2024 Jul 1;14(1):14994. doi: 10.1038/s41598-024-65540-0.
9. Deep Dehazing Network With Latent Ensembling Architecture and Adversarial Learning.
IEEE Trans Image Process. 2021;30:1354-1368. doi: 10.1109/TIP.2020.3044208. Epub 2020 Dec 23.
10. Multi-Scale Attention Feature Enhancement Network for Single Image Dehazing.
Sensors (Basel). 2023 Sep 27;23(19):8102. doi: 10.3390/s23198102.

Cited By

1. DFFNet: A Dual-Domain Feature Fusion Network for Single Remote Sensing Image Dehazing.
Sensors (Basel). 2025 Aug 18;25(16):5125. doi: 10.3390/s25165125.
2. Remote sensing image dehazing using a wavelet-based generative adversarial networks.
Sci Rep. 2025 Jan 29;15(1):3634. doi: 10.1038/s41598-025-87240-z.
3. Generative adversarial networks with texture recovery and physical constraints for remote sensing image dehazing.
Sci Rep. 2024 Dec 28;14(1):31426. doi: 10.1038/s41598-024-83088-x.

References

1. Semi-Supervised Domain Alignment Learning for Single Image Dehazing.
IEEE Trans Cybern. 2023 Nov;53(11):7238-7250. doi: 10.1109/TCYB.2022.3221544. Epub 2023 Oct 17.
2. Visibility Enhancement and Fog Detection: Solutions Presented in Recent Scientific Papers with Potential for Application to Mobile Systems.
Sensors (Basel). 2021 May 12;21(10):3370. doi: 10.3390/s21103370.
3. RefineDNet: A Weakly Supervised Refinement Framework for Single Image Dehazing.
IEEE Trans Image Process. 2021;30:3391-3404. doi: 10.1109/TIP.2021.3060873. Epub 2021 Mar 9.