Pipia Luca, Muñoz-Marí Jordi, Amin Eatidal, Belda Santiago, Camps-Valls Gustau, Verrelst Jochem
Image Processing Laboratory (IPL), Parc Científic, Universitat de València, 46980, Paterna, València, Spain.
Remote Sens Environ. 2019 Dec 15;235. doi: 10.1016/j.rse.2019.111452.
The availability of satellite optical information is often hampered by the natural presence of clouds, which can be problematic for many applications. Persistent clouds over agricultural fields can mask key stages of crop growth, leading to unreliable yield predictions. Synthetic Aperture Radar (SAR) provides all-weather imagery that can potentially overcome this limitation, but given its high and distinct sensitivity to different surface properties, the fusion of SAR and optical data remains an open challenge. In this work, we propose the use of Multi-Output Gaussian Process (MOGP) regression, a machine learning technique that automatically learns the statistical relationships among multisensor time series, to detect vegetated areas over which the SAR-optical synergy is profitable. For this purpose, we use Sentinel-1 Radar Vegetation Index (RVI) and Sentinel-2 Leaf Area Index (LAI) time series over a study area in the northwest of the Iberian Peninsula. Through a physical interpretation of the trained MOGP models, we show their ability to estimate LAI even over cloudy periods by exploiting the information shared with RVI, which guarantees that the solution always remains tied to real measurements. Results demonstrate the advantage of MOGP especially for long data gaps, where optical-based methods notoriously fail. A leave-one-image-out assessment applied to the whole vegetation cover shows that MOGP predictions improve on standard GP estimations over short time gaps (R of 74% vs 68%, RMSE of 0.4 vs 0.44 [ ]) and especially over long time gaps (R of 33% vs 12%, RMSE of 0.5 vs 1.09 [ ]). A second assessment focuses on crop-specific regions, clustering pixels that fulfil specific model conditions under which the synergy is profitable. Results reveal that MOGP performance is crop-type and crop-stage dependent. For long time gaps, the best R values are obtained over maize, ranging from 0.1 (tillering) to 0.36 (development) up to 0.81 (maturity); for moderate time gaps, R = 0.93 (maturity) is obtained. Crops such as wheat, oats, rye and barley can profit from the LAI-RVI synergy, with R varying between 0.4 and 0.6. For beet or potatoes, MOGP provides poorer results, but alternative descriptors to RVI should be tested for these specific crops before discarding the real benefits of the synergy. In conclusion, active-passive sensor fusion with MOGP represents a novel and promising approach to cope with crop monitoring over cloud-dominated areas.
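To illustrate the kind of multi-output GP gap-filling described above, the following is a minimal sketch, not the authors' implementation: it uses the GPy library's intrinsic coregionalization (ICM) kernel and synthetic time series standing in for a single pixel's Sentinel-1 RVI and Sentinel-2 LAI, with a simulated cloudy period removed from the LAI series.

```python
# Minimal MOGP gap-filling sketch (assumptions: GPy, ICM kernel with an RBF
# base, synthetic RVI/LAI series for one pixel; not the paper's exact setup).
import numpy as np
import GPy


def season(t):
    """Shared seasonal signal driving both outputs (illustrative only)."""
    return 3.0 * np.exp(-0.5 * ((t - 200.0) / 60.0) ** 2)


rng = np.random.default_rng(0)

# Time axes in days: dense all-weather SAR sampling, sparser optical sampling.
t_rvi = np.linspace(0, 365, 60)[:, None]
t_lai = np.linspace(0, 365, 40)[:, None]

rvi = 0.2 + 0.15 * season(t_rvi) + 0.02 * rng.standard_normal(t_rvi.shape)
lai = season(t_lai) + 0.1 * rng.standard_normal(t_lai.shape)

# Simulate a persistent cloudy period: drop LAI samples between day 150 and 250.
keep = ((t_lai < 150) | (t_lai > 250)).ravel()
t_lai_obs, lai_obs = t_lai[keep], lai[keep]

# ICM kernel: one RBF kernel shared by the two outputs, plus a learned 2x2
# coregionalization matrix encoding how much signal LAI and RVI share.
kern = GPy.util.multioutput.ICM(input_dim=1, num_outputs=2,
                                kernel=GPy.kern.RBF(input_dim=1))

model = GPy.models.GPCoregionalizedRegression(
    X_list=[t_rvi, t_lai_obs],   # output 0 = RVI, output 1 = LAI
    Y_list=[rvi, lai_obs],
    kernel=kern,
)
model.optimize(messages=False)

# Predict LAI over the whole year, including the gap. GPy expects the output
# index appended as an extra input column and passed again via Y_metadata.
t_new = np.linspace(0, 365, 200)[:, None]
X_new = np.hstack([t_new, np.ones_like(t_new)])  # index 1 -> LAI output
mean, var = model.predict(X_new,
                          Y_metadata={'output_index': X_new[:, 1:].astype(int)})
print(mean[:5].ravel(), var[:5].ravel())
```

In this sketch the reconstruction of LAI inside the gap is driven by the RVI observations through the learned coregionalization matrix, which is the mechanism the abstract refers to when it says the solution remains tied to real measurements.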