Tian Fengkai, Vieira Caio Canella, Zhou Jing, Zhou Jianfeng, Chen Pengyin
Department of Biomedical, Biological and Chemical Engineering, University of Missouri, Columbia, MO 65211, USA.
Crop, Soil, and Environmental Sciences, Bumpers College, University of Arkansas, Fayetteville, AR 72701, USA.
Sensors (Basel). 2023 Mar 19;23(6):3241. doi: 10.3390/s23063241.
Weeds can cause significant yield losses and will continue to be a problem for agricultural production under climate change. Dicamba is widely used to control broadleaf weeds in monocot crops and, increasingly, in genetically engineered dicamba-tolerant (DT) dicot crops such as soybean and cotton; this widespread use has resulted in severe off-target dicamba exposure and substantial yield losses in non-tolerant crops. There is strong demand for non-genetically engineered DT soybeans developed through conventional breeding selection. Public breeding programs have identified genetic resources that confer greater tolerance to off-target dicamba damage in soybeans. Efficient, high-throughput phenotyping tools can facilitate the collection of accurate measurements for a large number of crop traits and thereby improve breeding efficiency. This study aimed to evaluate unmanned aerial vehicle (UAV) imagery and deep-learning-based data analytic methods for quantifying off-target dicamba damage in genetically diverse soybean genotypes. A total of 463 soybean genotypes were planted in five fields (with different soil types) with prolonged exposure to off-target dicamba in 2020 and 2021. Crop damage due to off-target dicamba was assessed by breeders on a 1-5 scale in 0.5 increments, which was further classified into three classes: susceptible (≥3.5), moderate (2.0 to 3.0), and tolerant (≤1.5). A UAV platform equipped with a red-green-blue (RGB) camera was used to collect images on the same days. Collected images were stitched to generate an orthomosaic image for each field, and soybean plots were manually segmented from the orthomosaic images. Deep learning models, including dense convolutional neural network-121 (DenseNet121), residual neural network-50 (ResNet50), visual geometry group-16 (VGG16), and a depthwise-separable-convolution network (Xception), were developed to quantify crop damage levels. Results show that DenseNet121 had the best performance, classifying damage with an accuracy of 82%.
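The three-class labeling scheme described above (susceptible ≥3.5, moderate 2.0-3.0, tolerant ≤1.5, from a 1-5 score in 0.5 increments) can be sketched as a simple mapping; the function name and structure here are illustrative, not from the paper:

```python
def damage_class(score: float) -> str:
    """Map a breeder-assigned 1-5 dicamba damage score (0.5 increments)
    to one of the three tolerance classes defined in the study.
    (Illustrative helper; naming is an assumption, not the authors' code.)"""
    if score <= 1.5:
        return "tolerant"
    elif score <= 3.0:  # the 2.0-3.0 band; scores between 1.5 and 2.0 cannot occur
        return "moderate"
    else:               # >= 3.5
        return "susceptible"
```

Because scores move in 0.5 steps, the three bands cover every possible rating with no gaps in practice.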
The 95% binomial proportion confidence interval showed a range of accuracy from 79% to 84% (p-value ≤ 0.01). In addition, no extreme misclassifications (i.e., misclassification between tolerant and susceptible soybeans) were observed. The results are promising since soybean breeding programs typically aim to identify genotypes with 'extreme' phenotypes (e.g., the top 10% of highly tolerant genotypes). This study demonstrates that UAV imagery and deep learning have great potential for high-throughput quantification of soybean damage due to off-target dicamba, and can improve the efficiency of crop breeding programs in selecting soybean genotypes with desired traits.
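A binomial proportion confidence interval of the kind reported above can be computed with the standard normal approximation; the sample size below is a placeholder, since the abstract does not state how many test plots were evaluated:

```python
import math

def binomial_ci(p_hat: float, n: int, z: float = 1.96) -> tuple:
    """Normal-approximation (Wald) 95% binomial proportion confidence
    interval for a classification accuracy p_hat measured on n samples.
    (Sketch only; n below is a hypothetical value, not the paper's.)"""
    half_width = z * math.sqrt(p_hat * (1.0 - p_hat) / n)
    return (p_hat - half_width, p_hat + half_width)

# With ~900 assumed test plots, an 82% accuracy yields roughly a +/-2.5% interval.
low, high = binomial_ci(0.82, 900)
```

Exact or Wilson intervals are preferable at small n, but the Wald form shows the idea: the interval narrows as the number of evaluated plots grows.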