Guo Zhiqiang, Goh Hui Hwang, Li Xiuhua, Zhang Muqing, Li Yong
School of Electrical Engineering, Guangxi University, Nanning, China.
Guangxi Key Laboratory of Sugarcane Biology, Guangxi University, Nanning, China.
Front Plant Sci. 2023 Jul 24;14:1226329. doi: 10.3389/fpls.2023.1226329. eCollection 2023.
Accurate and dependable weed detection technology is a prerequisite for weed control robots to perform autonomous weeding. Owing to the complexity of the farmland environment and the resemblance between crops and weeds, detecting weeds in the field under natural conditions is a difficult task. Compared with conventional weed detection methods, existing deep learning-based approaches often suffer from issues such as monotonous detection scenes, a lack of image samples and location information for the detected objects, and low detection accuracy. To address these issues, WeedNet-R, a vision-based network for weed identification and localization in sugar beet fields, is proposed. WeedNet-R adds multiple context modules to the neck of RetinaNet to fuse context information from multiple feature maps and thereby enlarge the effective receptive field of the entire network. Meanwhile, during model training, a learning rate adjustment method combining an untuned exponential warmup schedule with cosine annealing is applied. As a result, the proposed method achieves higher weed detection accuracy without a considerable increase in model parameters. WeedNet-R was trained and evaluated on the OD-SugarBeets dataset, which was constructed by manually adding bounding box labels to the publicly available agricultural dataset SugarBeet2016. Compared with the original RetinaNet, the mAP of the proposed WeedNet-R on the weed detection task in sugar beet fields increased by 4.65% to 92.30%. The average precision of WeedNet-R for weed and sugar beet is 85.70% and 98.89%, respectively. WeedNet-R outperforms other advanced object detection algorithms in terms of detection accuracy while matching other single-stage detectors in terms of detection speed.
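The abstract does not specify the internal design of the context modules added to the RetinaNet neck. The sketch below is a minimal illustration, assuming each module fuses parallel dilated 3x3 convolutions over a 256-channel FPN level and adds the result back residually; the class name ContextModule, the dilation rates, and the channel width are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of a context module applied to FPN levels (assumed design).
    import torch
    import torch.nn as nn

    class ContextModule(nn.Module):
        """Aggregate multi-scale context with parallel dilated 3x3 convolutions."""
        def __init__(self, channels=256, dilations=(1, 2, 4)):
            super().__init__()
            self.branches = nn.ModuleList(
                [nn.Conv2d(channels, channels, 3, padding=d, dilation=d) for d in dilations]
            )
            self.fuse = nn.Conv2d(channels * len(dilations), channels, 1)
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x):
            # Concatenate the dilated branches, fuse them, and keep a residual path
            # so the original FPN feature is preserved.
            ctx = torch.cat([branch(x) for branch in self.branches], dim=1)
            return self.relu(x + self.fuse(ctx))

    # Applied to each pyramid level produced by the FPN neck (dummy P3-P5 tensors):
    p_levels = [torch.randn(1, 256, s, s) for s in (64, 32, 16)]
    context = ContextModule()
    enriched = [context(p) for p in p_levels]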
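As a rough illustration of the described training policy, the sketch below combines an untuned exponential warmup factor with cosine annealing as a single multiplicative learning rate schedule in PyTorch. The warmup form (driven by the Adam beta2 term), the step budget, and the base learning rate are assumptions rather than values reported in the paper.

    # Minimal sketch: untuned exponential warmup multiplied by cosine annealing.
    import math
    import torch

    def make_warmup_cosine_lambda(beta2=0.999, total_steps=30000):
        """Return a multiplicative LR factor: exponential warmup * cosine decay."""
        def lr_lambda(step):
            # Untuned exponential warmup: rises toward 1 at a rate set by beta2.
            warmup = 1.0 - math.exp(-(1.0 - beta2) * (step + 1))
            # Cosine annealing from 1 down to 0 over the assumed training horizon.
            cosine = 0.5 * (1.0 + math.cos(math.pi * min(step, total_steps) / total_steps))
            return warmup * cosine
        return lr_lambda

    model = torch.nn.Linear(10, 2)  # placeholder model for the sketch
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))
    scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, make_warmup_cosine_lambda())

    for step in range(100):  # training loop stub
        optimizer.step()
        scheduler.step()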