College of Geomatics and Geoinformation, Guilin University of Technology, Guilin 541004, China.
Ecological Spatiotemporal Big Data Perception Service Laboratory, Guilin University of Technology, Guilin 541004, China.
Sensors (Basel). 2023 Jun 21;23(13):5807. doi: 10.3390/s23135807.
With the increasing demand for remote sensing image applications, retrieving target images from massive remote sensing image collections has become a research hotspot. Previous retrieval methods cannot simultaneously guarantee efficiency, accuracy, and interpretability in the retrieval process. We therefore propose a bag-of-words association mapping method that can explain the semantic derivation process of remote sensing images. The method constructs associations between low-level features and high-level semantics through visual-feature bag-of-words. An improved FP-Growth method is proposed to construct strong association rules mapping visual words to semantics. A feedback mechanism is established that improves the accuracy of subsequent retrievals by reducing the semantic probability of incorrect retrieval results. The public datasets AID and NWPU-RESISC45 were used for the validation experiments. The experimental results show that the average accuracies on the two datasets reach 87.5% and 90.8%, which are 22.5% and 20.3% higher than VGG16, and 17.6% and 15.6% higher than ResNet18, respectively. These results validate the effectiveness of the proposed method.
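The abstract does not detail the authors' improved FP-Growth algorithm. As a rough illustration of the general idea of mining strong association rules from visual words to semantic labels, the following is a minimal support/confidence sketch (a simplified frequent-itemset count, not the paper's FP-Growth variant; all names, thresholds, and the toy data are illustrative assumptions):

```python
from collections import Counter
from itertools import combinations

def mine_rules(transactions, labels, min_support=0.3, min_confidence=0.7):
    """Mine strong rules {visual-word itemset} -> semantic label.

    transactions: list of sets of visual-word IDs (one set per image)
    labels: semantic label of each image
    Itemsets are limited to sizes 1 and 2 for brevity; a real
    FP-Growth implementation would mine all frequent itemsets
    from a compressed FP-tree instead of enumerating them.
    """
    n = len(transactions)
    itemset_count = Counter()          # support counts of itemsets
    rule_count = Counter()             # (itemset, label) co-occurrence
    for words, label in zip(transactions, labels):
        for k in (1, 2):
            for itemset in combinations(sorted(words), k):
                itemset_count[itemset] += 1
                rule_count[(itemset, label)] += 1
    rules = []
    for (itemset, label), c in rule_count.items():
        support = c / n
        confidence = c / itemset_count[itemset]
        if support >= min_support and confidence >= min_confidence:
            rules.append((itemset, label, support, confidence))
    return rules

# Toy example: 4 images, each a set of quantized visual-word IDs
transactions = [{1, 2}, {1, 2}, {1, 3}, {2, 3}]
labels = ["forest", "forest", "urban", "urban"]
rules = mine_rules(transactions, labels)
# e.g. visual words {1, 2} strongly imply the label "forest"
```

A feedback step, as described in the abstract, could then down-weight the confidence of a rule whenever it leads to an incorrect retrieval, lowering the semantic probability of that mapping in subsequent queries.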