Suppr 超能文献



A Data-Driven Approach to SAR Data-Focusing.

Affiliations

DEI-Department of Electrical and Information Engineering, Politecnico di Bari, 70126 Bari, Italy.

STIIMA-Institute of Intelligent Industrial Technologies and Systems for Advanced Manufacturing, CNR-Italian National Research Council, 70124 Bari, Italy.

Publication

Sensors (Basel). 2019 Apr 6;19(7):1649. doi: 10.3390/s19071649.

DOI: 10.3390/s19071649
PMID: 30959911
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6480279/
Abstract

Synthetic Aperture Radar (SAR) is a radar imaging technique in which the relative motion of the sensor is used to synthesize a very long antenna and obtain high spatial resolution. Several algorithms for SAR data-focusing are well established and used by space agencies. Such algorithms are model-based, i.e., the radiometric and geometric information about the specific sensor must be well known, together with the ancillary data acquired on board the platform. In the development of low-cost and lightweight SAR sensors for use in several application fields, the need for precise mission parameters and for knowledge of all the specific geometric and radiometric information about the sensor can complicate the hardware and software requirements. Although SAR data processing is a well-established imaging technique, the proposed algorithm aims to exploit the SAR coherent illumination, demonstrating the possibility of extracting the reference functions, in both the range and azimuth directions, when a strong point scatterer (either natural or manmade) is present in the scene. Singular Value Decomposition is used to exploit the inherent redundancy in the raw data matrix, and phase unwrapping and polynomial fitting are used to reconstruct clean versions of the reference functions. Fairly well-focused images can be obtained from both synthetic and real raw data matrices without knowledge of mission parameters or ancillary data; as a byproduct, the azimuth beam pattern and estimates of a few other parameters are extracted from the raw data itself. In a previous paper, the authors introduced preliminary work on this problem that obtained good-quality images compared to standard processing techniques. In this work, the proposed technique is described, and performance parameters are extracted to compare the proposed approach to the Range-Doppler (RD) algorithm, showing good agreement of the focused images and pulse responses.
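The processing chain the abstract describes (SVD of the raw data matrix to isolate the dominant component, then phase unwrapping and polynomial fitting to clean it into a reference function) can be sketched in NumPy. This is an illustrative sketch, not the authors' implementation: the function name, the matrix orientation (rows = azimuth samples, columns = range samples), and the default polynomial degree are assumptions.

```python
import numpy as np

def estimate_reference_function(raw, poly_deg=2):
    """Estimate a clean reference function from a complex raw-data matrix
    dominated by a strong point scatterer.

    Steps (mirroring the abstract): SVD exploits the redundancy of the raw
    data matrix; phase unwrapping and polynomial fitting reconstruct a
    clean version of the dominant component's phase history.
    """
    # Rank-1 view of the data: when a strong point scatterer is present,
    # the leading right singular vector approximates the common reference
    # chirp shared by all rows (up to a global complex scale).
    U, s, Vh = np.linalg.svd(raw, full_matrices=False)
    v = Vh[0]

    # Unwrap the phase of the dominant component to remove 2*pi jumps ...
    phase = np.unwrap(np.angle(v))

    # ... and fit a low-order polynomial to suppress noise. Degree 2 is
    # assumed here because a linear-FM chirp has quadratic phase.
    n = np.arange(phase.size)
    coeffs = np.polyfit(n, phase, poly_deg)
    clean_phase = np.polyval(coeffs, n)

    # Unit-amplitude reconstructed reference function.
    return np.exp(1j * clean_phase)
```

As a quick usage example, one can build a synthetic raw matrix whose rows all contain a noisy copy of the same quadratic-phase chirp and check that the recovered function correlates strongly with the true one; the global phase offset from the SVD is absorbed by the constant term of the polynomial fit.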


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/dd58f343bf5e/sensors-19-01649-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/da95d490fd4b/sensors-19-01649-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/214b845012f1/sensors-19-01649-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/f6cc04571bae/sensors-19-01649-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/104489176c33/sensors-19-01649-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/37985f600a72/sensors-19-01649-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/d2d9b83b432f/sensors-19-01649-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/13c521bd4c05/sensors-19-01649-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/6c5e8d1b9f95/sensors-19-01649-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/ee2676b78dc5/sensors-19-01649-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/6b40498a58e9/sensors-19-01649-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/35ac7adffdb4/sensors-19-01649-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1f74/6480279/fcec2149f261/sensors-19-01649-g013.jpg

Similar Articles

1. A Data-Driven Approach to SAR Data-Focusing. Sensors (Basel). 2019 Apr 6;19(7):1649. doi: 10.3390/s19071649.
2. A Phase-Preserving Focusing Technique for TOPS Mode SAR Raw Data Based on Conventional Processing Methods. Sensors (Basel). 2019 Jul 29;19(15):3321. doi: 10.3390/s19153321.
3. Sliding Spotlight Mode Imaging with GF-3 Spaceborne SAR Sensor. Sensors (Basel). 2017 Dec 26;18(1):43. doi: 10.3390/s18010043.
4. An Azimuth Antenna Pattern Estimation Method Based on Doppler Spectrum in SAR Ocean Images. Sensors (Basel). 2018 Apr 3;18(4):1081. doi: 10.3390/s18041081.
5. Staring Spotlight SAR with Nonlinear Frequency Modulation Signal and Azimuth Non-Uniform Sampling for Low Sidelobe Imaging. Sensors (Basel). 2021 Sep 28;21(19):6487. doi: 10.3390/s21196487.
6. SAR Image Formation Method with Azimuth Periodically Missing Data Based on RELAX Algorithm. Sensors (Basel). 2020 Dec 24;21(1):49. doi: 10.3390/s21010049.
7. SAR System for UAV Operation with Motion Error Compensation beyond the Resolution Cell. Sensors (Basel). 2008 May 23;8(5):3384-3405. doi: 10.3390/s8053384.
8. Azimuth Full-Aperture Processing of Spaceborne Squint SAR Data with Block Varying PRF. Sensors (Basel). 2022 Nov 30;22(23):9328. doi: 10.3390/s22239328.
9. A High-Resolution SAR Focusing Experiment Based on GF-3 Staring Data. Sensors (Basel). 2018 Mar 22;18(4):943. doi: 10.3390/s18040943.
10. Compressive Sensing-Based SAR Image Reconstruction from Sparse Radar Sensor Data Acquisition in Automotive FMCW Radar System. Sensors (Basel). 2021 Nov 1;21(21):7283. doi: 10.3390/s21217283.

Cited By

1. Special Issue "Synthetic Aperture Radar (SAR) Techniques and Applications". Sensors (Basel). 2020 Mar 27;20(7):1851. doi: 10.3390/s20071851.
2. Interferometric DEM-Assisted High Precision Imaging Method for ArcSAR. Sensors (Basel). 2019 Jul 1;19(13):2921. doi: 10.3390/s19132921.