Qualification of Soybean Responses to Flooding Stress Using UAV-Based Imagery and Deep Learning.

Author Information

Zhou Jing, Mou Huawei, Zhou Jianfeng, Ali Md Liakat, Ye Heng, Chen Pengyin, Nguyen Henry T

Affiliations

Division of Food Systems and Bioengineering, University of Missouri, Columbia, MO 65211, USA.

Bioenergy and Environment Science & Technology Laboratory, College of Engineering, China Agricultural University, Beijing 100083, China.

Publication Information

Plant Phenomics. 2021 Jun 28;2021:9892570. doi: 10.34133/2021/9892570. eCollection 2021.

Abstract

Soybean is sensitive to flooding stress, which may result in poor seed quality and significant yield reduction. Soybean production under flooding could be sustained by developing flood-tolerant cultivars through breeding programs. Conventionally, soybean tolerance to flooding in field conditions is evaluated by visually rating the shoot injury/damage caused by flooding stress, which is labor-intensive and subject to human error. Recent developments in field high-throughput phenotyping technology have shown great potential for measuring crop traits and detecting crop responses to abiotic and biotic stresses. The goal of this study was to investigate the potential of estimating flood-induced soybean injuries using UAV-based image features collected at different flight heights. The flooding injury score (FIS) of 724 soybean breeding plots was rated visually by breeders when the soybeans showed obvious injury symptoms. Aerial images were taken on the same day using a five-band multispectral camera and an infrared (IR) thermal camera at 20, 50, and 80 m above ground. Five image features, i.e., canopy temperature, normalized difference vegetation index (NDVI), canopy area, width, and length, were extracted from the images at the three flight heights. A deep learning model was used to classify the soybean breeding plots into five FIS ratings based on the extracted image features. Results show that the image features differed significantly across the three flight heights. The best classification performance was obtained by the model developed using the image features collected at 20 m, with an accuracy of 0.9 for the five-level FIS. These results indicate that the proposed method is very promising for estimating FIS in soybean breeding.
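As a rough illustration of the pipeline summarized above, the sketch below (Python) extracts the five plot-level image features from co-registered multispectral and thermal rasters and trains a small fully connected network to classify plots into the five FIS ratings. The NDVI formula, NDVI = (NIR - Red) / (NIR + Red), is standard; the canopy-segmentation threshold, pixel size, network architecture, and training settings are illustrative assumptions, since the abstract does not specify the authors' actual feature-extraction parameters or deep learning architecture.

# Minimal sketch (not the authors' implementation): extract plot-level image
# features and classify them into five flooding injury score (FIS) ratings.
import numpy as np
import torch
import torch.nn as nn

def extract_plot_features(red, nir, thermal, pixel_size_m=0.01):
    """Compute [canopy temperature, mean NDVI, canopy area, width, length]
    for one plot from co-registered 2-D band arrays. The NDVI threshold and
    pixel size are assumed values for illustration."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    canopy = ndvi > 0.3                        # assumed canopy/soil threshold
    rows, cols = np.nonzero(canopy)
    canopy_temp = float(thermal[canopy].mean())
    area_m2 = canopy.sum() * pixel_size_m ** 2
    width_m = (cols.max() - cols.min() + 1) * pixel_size_m
    length_m = (rows.max() - rows.min() + 1) * pixel_size_m
    return [canopy_temp, float(ndvi[canopy].mean()), area_m2, width_m, length_m]

class FISClassifier(nn.Module):
    """Small fully connected network: 5 image features -> logits for 5 FIS classes."""
    def __init__(self, n_features=5, n_classes=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, 32), nn.ReLU(),
            nn.Linear(32, n_classes),
        )
    def forward(self, x):
        return self.net(x)

def train(model, features, labels, epochs=200, lr=1e-3):
    """features: (n_plots, 5) float tensor; labels: (n_plots,) long tensor of classes 0-4."""
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        optimizer.zero_grad()
        loss_fn(model(features), labels).backward()
        optimizer.step()
    return model

if __name__ == "__main__":
    # Synthetic stand-in data (724 plots x 5 features); real features would come
    # from extract_plot_features() applied to each breeding plot's imagery.
    X = torch.randn(724, 5)
    y = torch.randint(0, 5, (724,))
    model = train(FISClassifier(), X, y)
    print("training accuracy:", (model(X).argmax(1) == y).float().mean().item())

A shallow network is used here only because the input is five tabular features per plot; the model, loss, and evaluation protocol in the study itself may differ.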

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/529f/8261669/7f25a7ff3f4d/PLANTPHENOMICS2021-9892570.001.jpg
