Servera Jorge Vicent, Alonso Luis, Martino Luca, Sabater Neus, Verrelst Jochem, Camps-Valls Gustau, Moreno José
Image Processing Laboratory, University of Valencia, 46980 Valencia, Spain.
IEEE Trans Geosci Remote Sens. 2019 Feb;57(2):1040-1048. doi: 10.1109/tgrs.2018.2864517.
Physically based radiative transfer models (RTMs) are widely used in Earth observation to understand the radiation processes occurring on the Earth's surface and their interactions with water, vegetation, and atmosphere. Through continuous improvements, RTMs have increased in accuracy and representativeness of complex scenes at the expense of greater complexity and computation time, making them impractical in various remote sensing applications. To overcome this limitation, the common practice is to precompute large lookup tables (LUTs) for later interpolation. To further reduce the RTM computation burden and the error in LUT interpolation, we have developed a method to automatically select the minimum and optimal set of input-output points (nodes) to be included in an LUT. We present the gradient-based automatic LUT generator algorithm (GALGA), which relies on the notion of an acquisition function that incorporates: 1) the Jacobian evaluation of an RTM and 2) the information about the multivariate distribution of the current nodes. We illustrate the capabilities of GALGA in the automatic construction and optimization of MODTRAN-based LUTs for input variable spaces of different dimensions. Our results indicate that, compared with a pseudorandom homogeneous distribution of the LUT nodes, GALGA reduces: 1) the LUT size by >24%; 2) the computation time by 27%; and 3) the maximum interpolation relative errors by at least 10%. It is concluded that an automatic LUT design might benefit from the methodology proposed in GALGA to reduce interpolation errors and computation time in computationally expensive RTMs.
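The abstract does not spell out the acquisition function, but its two ingredients (a Jacobian term and a term describing the distribution of the current nodes) suggest a greedy adaptive-sampling loop. The following minimal Python sketch illustrates that general idea under stated assumptions: the function names (toy_rtm, acquisition, build_lut), the product-of-terms scoring rule, and the toy forward model are all hypothetical stand-ins, not the actual GALGA criterion, stopping rule, or MODTRAN coupling described in the paper.

```python
import numpy as np

# Toy stand-in for an RTM forward model (MODTRAN itself is far too costly to call here);
# the function name and functional form are illustrative only.
def toy_rtm(x):
    # x: (n, d) array of input variables -> one scalar "radiance" per sample
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1])

def jacobian_norm(f, x, eps=1e-4):
    """Finite-difference estimate of the Jacobian magnitude at each point."""
    n, d = x.shape
    grads = np.zeros((n, d))
    for j in range(d):
        dx = np.zeros(d)
        dx[j] = eps
        grads[:, j] = (f(x + dx) - f(x - dx)) / (2 * eps)
    return np.linalg.norm(grads, axis=1)

def acquisition(candidates, nodes, f):
    """Hypothetical acquisition score: large RTM gradient AND far from existing nodes."""
    grad_term = jacobian_norm(f, candidates)
    # distance of every candidate to its nearest current LUT node
    dists = np.min(np.linalg.norm(candidates[:, None, :] - nodes[None, :, :], axis=2), axis=1)
    return grad_term * dists

def build_lut(f, bounds, n_init=8, n_add=40, n_candidates=2000, seed=0):
    rng = np.random.default_rng(seed)
    d = len(bounds)
    lo, hi = np.array(bounds).T
    # pseudorandom initial design, as in the baseline the paper compares against
    nodes = lo + (hi - lo) * rng.random((n_init, d))
    for _ in range(n_add):
        cand = lo + (hi - lo) * rng.random((n_candidates, d))
        best = cand[np.argmax(acquisition(cand, nodes, f))]
        nodes = np.vstack([nodes, best])  # add the most "informative" candidate node
    return nodes, f(nodes)

nodes, values = build_lut(toy_rtm, bounds=[(0.0, 1.0), (0.0, 1.0)])
print(nodes.shape, values.shape)  # (48, 2) (48,)
```

In this sketch, nodes accumulate where the toy model changes rapidly (high gradient) while avoiding clustering near existing nodes, which is the qualitative behavior the abstract attributes to GALGA's acquisition function.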