Gu Chunyan, Wang Daoyong, Zhang Huihui, Zhang Jian, Zhang Dongyan, Liang Dong
Institute of Plant Protection and Agro-products Safety, Anhui Academy of Agricultural Sciences, Hefei, China.
National Engineering Research Center for Agro-Ecological Big Data Analysis & Application, Anhui University, Hefei, China.
Front Plant Sci. 2021 Jan 21;11:599886. doi: 10.3389/fpls.2020.599886. eCollection 2020.
A fast and nondestructive method for recognizing the severity of wheat Fusarium head blight (FHB) can effectively reduce fungicide use and associated costs in wheat production. This study proposed a feature fusion method based on deep convolutional features and shallow features derived from high-resolution digital red-green-blue (RGB) images of wheat FHB at different disease severity levels. To test the robustness of the proposed method, the RGB images were taken under different influencing factors, including lighting conditions, camera shooting angle, image resolution, and crop growth stage. All images were preprocessed to remove background noise and improve recognition accuracy. AlexNet model parameters pretrained on the ImageNet 2012 dataset were transferred to the wheat FHB dataset to extract deep convolutional features. Next, the color and texture features of wheat ears were extracted as shallow features. Then, the Relief-F algorithm was used to fuse the deep convolutional features and shallow features into the final FHB features. Finally, a random forest classifier was used to identify the different FHB severity levels from the fused features. Results show that the recognition accuracy of the proposed fused-feature model was higher than that of models using other features under all conditions. The highest recognition accuracy of severity levels was obtained when images were taken indoors, at high resolution (12 megapixels), at a 90° shooting angle, during the grain-filling stage. The Relief-F algorithm assigned different weights to the features under different influencing factors, which made the fused-feature model more robust and improved its ability to recognize wheat FHB severity levels from RGB images.
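The abstract describes a pipeline of pretrained-AlexNet deep features, shallow color/texture features, Relief-F feature weighting, and random forest classification. The sketch below is a minimal illustration of that pipeline, not the authors' code: it assumes PyTorch/torchvision for the pretrained AlexNet, scikit-image for HSV color statistics and GLCM texture, the skrebate package's ReliefF implementation for feature weighting/selection, and scikit-learn's RandomForestClassifier. All layer choices, feature definitions, and hyperparameters are illustrative assumptions, not values from the paper.

```python
"""Hedged sketch of the deep + shallow feature fusion pipeline described in the abstract."""
import numpy as np
import torch
from torchvision import models, transforms
from skimage.color import rgb2gray, rgb2hsv
from skimage.feature import graycomatrix, graycoprops
from skrebate import ReliefF
from sklearn.ensemble import RandomForestClassifier

# Deep convolutional features: AlexNet pretrained on ImageNet (torchvision >= 0.13 weights API).
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
alexnet.classifier = alexnet.classifier[:-1]  # drop the 1000-way output layer, keep the 4096-d fc7 activations
alexnet.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize((224, 224)),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def deep_features(rgb_image: np.ndarray) -> np.ndarray:
    """4096-d AlexNet fc7 activations for one background-removed RGB image (H x W x 3, uint8)."""
    with torch.no_grad():
        x = preprocess(rgb_image).unsqueeze(0)
        return alexnet(x).squeeze(0).numpy()

def shallow_features(rgb_image: np.ndarray) -> np.ndarray:
    """Illustrative shallow features: HSV channel statistics plus GLCM texture properties."""
    hsv = rgb2hsv(rgb_image)
    color = np.concatenate([hsv.mean(axis=(0, 1)), hsv.std(axis=(0, 1))])
    gray = (rgb2gray(rgb_image) * 255).astype(np.uint8)
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    texture = np.array([graycoprops(glcm, p).mean()
                        for p in ("contrast", "homogeneity", "energy", "correlation")])
    return np.concatenate([color, texture])

def build_feature_matrix(images) -> np.ndarray:
    """Concatenate deep and shallow features for each image into one row of the feature matrix."""
    return np.vstack([np.concatenate([deep_features(im), shallow_features(im)]) for im in images])

def train_severity_model(images, severity_labels, n_selected=200):
    """Relief-F weights/selects the fused features; a random forest classifies FHB severity levels."""
    X = build_feature_matrix(images)
    y = np.asarray(severity_labels)
    relief = ReliefF(n_features_to_select=n_selected, n_neighbors=10)
    relief.fit(X, y)
    X_sel = relief.transform(X)
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    rf.fit(X_sel, y)
    return relief, rf
```

In this sketch the fusion step is realized as Relief-F scoring of the concatenated deep and shallow feature vector, so that features with higher weights under a given imaging condition dominate the selected subset fed to the random forest; the paper's exact weighting and fusion details may differ.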