

Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields

Authors

Gao Junfeng, French Andrew P, Pound Michael P, He Yong, Pridmore Tony P, Pieters Jan G

Affiliations

1 Lincoln Institute for Agri-food Technology, University of Lincoln, Riseholme Park, Lincoln, LN2 2LG, UK.

2 Department of Biosystems Engineering, Ghent University, Coupure Links 653, 9000 Ghent, Belgium.

Publication

Plant Methods. 2020 Mar 5;16:29. doi: 10.1186/s13007-020-00570-z. eCollection 2020.

DOI: 10.1186/s13007-020-00570-z
PMID: 32165909
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7059384/
Abstract

BACKGROUND

Convolvulus sepium (hedge bindweed) detection in sugar beet fields remains a challenging problem due to variation in plant appearance, illumination changes, foliage occlusions, and different growth stages under field conditions. Current approaches for weed and crop recognition, segmentation and detection rely predominantly on conventional machine-learning techniques that require a large set of hand-crafted features for modelling. These might fail to generalize over different fields and environments.

RESULTS

Here, we present an approach that develops a deep convolutional neural network (CNN) based on the tiny YOLOv3 architecture for Convolvulus sepium and sugar beet detection. We generated 2271 synthetic images, before combining these images with 452 field images to train the developed model. YOLO anchor box sizes were calculated from the training dataset using a k-means clustering approach. The resulting model was tested on 100 field images, showing that training on the combination of synthetic and original field images improved the mean average precision (mAP) metric from 0.751 to 0.829 compared to using collected field images alone. We also compared the performance of the developed model with the YOLOv3 and Tiny YOLO models. The developed model achieved a better trade-off between accuracy and speed. Specifically, the average precisions (APs@IoU0.5) of Convolvulus sepium and sugar beet were 0.761 and 0.897 respectively, with 6.48 ms inference time per image (800 × 1200) on an NVIDIA Titan X GPU.
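The anchor-box step above can be sketched in a few lines. This is a minimal illustration, not the authors' code: it runs plain k-means on (width, height) pairs with Euclidean distance, whereas YOLO-style pipelines often cluster on 1 − IoU instead; the sample box sizes are invented for demonstration.

```python
import random

def kmeans_anchors(boxes, k=2, iters=100, seed=0):
    """Cluster (w, h) box sizes into k anchor sizes with plain k-means."""
    rng = random.Random(seed)
    centers = rng.sample(boxes, k)  # initialize centers from the data
    for _ in range(iters):
        # assign each box to its nearest center
        clusters = [[] for _ in range(k)]
        for w, h in boxes:
            i = min(range(k),
                    key=lambda j: (w - centers[j][0])**2 + (h - centers[j][1])**2)
            clusters[i].append((w, h))
        # recompute each center as its cluster mean (keep empty clusters fixed)
        new_centers = []
        for i, c in enumerate(clusters):
            if c:
                new_centers.append((sum(w for w, _ in c) / len(c),
                                    sum(h for _, h in c) / len(c)))
            else:
                new_centers.append(centers[i])
        if new_centers == centers:
            break  # converged
        centers = new_centers
    return sorted(centers)

# Illustrative training boxes: a small-object group and a large-object group
boxes = [(30 + d, 40 + d) for d in range(0, 20, 2)] + \
        [(120 + d, 150 + d) for d in range(0, 20, 2)]
anchors = kmeans_anchors(boxes, k=2)
```

With well-separated size groups like these, the two anchors land near the mean (w, h) of each group; in the paper's setting the same idea is applied to the ground-truth boxes of the training set, with k equal to the number of anchors the detector uses.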

CONCLUSION

The developed model has the potential to be deployed on an embedded mobile platform like the Jetson TX for online weed detection and management due to its high-speed inference. We recommend combining synthetic images and empirical field images during training to improve model performance.
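The APs@IoU0.5 reported above count a detection as a true positive only when its intersection-over-union (IoU) with a ground-truth box is at least 0.5. A minimal sketch of that overlap test, with hypothetical box coordinates:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])  # intersection top-left
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])  # intersection bottom-right
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

# A prediction offset by 10 px from a 100x100 ground-truth box
gt   = (10, 10, 110, 110)
pred = (20, 20, 120, 120)
print(round(iou(gt, pred), 3))  # → 0.681, a true positive at IoU ≥ 0.5
```

Average precision for each class is then computed from the precision-recall curve over detections ranked by confidence, using this 0.5 threshold to decide matches.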


Figures (PMC7059384)

Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/a18e55d388ab/13007_2020_570_Fig1_HTML.jpg
Fig. 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/6a9b67c3ab5f/13007_2020_570_Fig2_HTML.jpg
Fig. 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/90348580ce28/13007_2020_570_Fig3_HTML.jpg
Fig. 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/a8cfce2c744d/13007_2020_570_Fig4_HTML.jpg
Fig. 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/f9a414749dd8/13007_2020_570_Fig5_HTML.jpg
Fig. 6: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/60bcc0e91c9f/13007_2020_570_Fig6_HTML.jpg
Fig. 7: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/60a726fdedea/13007_2020_570_Fig7_HTML.jpg
Fig. 8: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/2bf839c7ade3/13007_2020_570_Fig8_HTML.jpg
Fig. 9: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e71e/7059384/fce0f7beac85/13007_2020_570_Fig9_HTML.jpg

Similar Articles

1. Deep convolutional neural networks for image-based Convolvulus sepium detection in sugar beet fields.
Plant Methods. 2020 Mar 5;16:29. doi: 10.1186/s13007-020-00570-z. eCollection 2020.
2. Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton.
Sci Rep. 2022 Nov 15;12(1):19580. doi: 10.1038/s41598-022-23399-z.
3. WeedNet-R: a sugar beet field weed detection algorithm based on enhanced RetinaNet and context semantic fusion.
Front Plant Sci. 2023 Jul 24;14:1226329. doi: 10.3389/fpls.2023.1226329. eCollection 2023.
4. Detection and analysis of wheat spikes using Convolutional Neural Networks.
Plant Methods. 2018 Nov 15;14:100. doi: 10.1186/s13007-018-0366-8. eCollection 2018.
5. Nature-Inspired Search Method and Custom Waste Object Detection and Classification Model for Smart Waste Bin.
Sensors (Basel). 2022 Aug 18;22(16):6176. doi: 10.3390/s22166176.
6. Efficient Deep Learning Architecture for Detection and Recognition of Thyroid Nodules.
Comput Intell Neurosci. 2020 Jul 29;2020:1242781. doi: 10.1155/2020/1242781. eCollection 2020.
7. Evaluation of Inference Performance of Deep Learning Models for Real-Time Weed Detection in an Embedded Computer.
Sensors (Basel). 2024 Jan 14;24(2):514. doi: 10.3390/s24020514.
8. Evaluation of different deep convolutional neural networks for detection of broadleaf weed seedlings in wheat.
Pest Manag Sci. 2022 Feb;78(2):521-529. doi: 10.1002/ps.6656. Epub 2021 Oct 5.
9. Performances of the LBP Based Algorithm over CNN Models for Detecting Crops and Weeds with Similar Morphologies.
Sensors (Basel). 2020 Apr 14;20(8):2193. doi: 10.3390/s20082193.
10. Tomato Anomalies Detection in Greenhouse Scenarios Based on YOLO-Dense.
Front Plant Sci. 2021 Apr 9;12:634103. doi: 10.3389/fpls.2021.634103. eCollection 2021.

Cited By

1. Diagnosis of non-puerperal mastitis based on "whole tongue" features: non-invasive biomarker mining and diagnostic model construction.
Front Cell Infect Microbiol. 2025 Jul 28;15:1602883. doi: 10.3389/fcimb.2025.1602883. eCollection 2025.
2. Smart weed recognition in saffron fields based on an improved EfficientNetB0 model and RGB images.
Sci Rep. 2025 May 2;15(1):15412. doi: 10.1038/s41598-025-00331-9.
3. Detection of surface defects in soybean seeds based on improved Yolov9.
Sci Rep. 2025 Apr 12;15(1):12631. doi: 10.1038/s41598-025-92429-3.
4. A Review of CNN Applications in Smart Agriculture Using Multimodal Data.
Sensors (Basel). 2025 Jan 15;25(2):472. doi: 10.3390/s25020472.
5. Improvement of the YOLOv8 Model in the Optimization of the Weed Recognition Algorithm in Cotton Field.
Plants (Basel). 2024 Jul 4;13(13):1843. doi: 10.3390/plants13131843.
6. Towards practical object detection for weed spraying in precision agriculture.
Front Plant Sci. 2023 Nov 3;14:1183277. doi: 10.3389/fpls.2023.1183277. eCollection 2023.
7. Object Detection for Agricultural Vehicles: Ensemble Method Based on Hierarchy of Classes.
Sensors (Basel). 2023 Aug 20;23(16):7285. doi: 10.3390/s23167285.
8. Overcoming field variability: unsupervised domain adaptation for enhanced crop-weed recognition in diverse farmlands.
Front Plant Sci. 2023 Aug 9;14:1234616. doi: 10.3389/fpls.2023.1234616. eCollection 2023.
9. Towards deep learning based smart farming for intelligent weeds management in crops.
Front Plant Sci. 2023 Jul 28;14:1211235. doi: 10.3389/fpls.2023.1211235. eCollection 2023.
10. WeedNet-R: a sugar beet field weed detection algorithm based on enhanced RetinaNet and context semantic fusion.
Front Plant Sci. 2023 Jul 24;14:1226329. doi: 10.3389/fpls.2023.1226329. eCollection 2023.

References

1. Potato Virus Y Detection in Seed Potatoes Using Deep Learning on Hyperspectral Images.
Front Plant Sci. 2019 Mar 1;10:209. doi: 10.3389/fpls.2019.00209. eCollection 2019.
2. An explainable deep machine vision framework for plant stress phenotyping.
Proc Natl Acad Sci U S A. 2018 May 1;115(18):4613-4618. doi: 10.1073/pnas.1716999115. Epub 2018 Apr 16.
3. Deep machine learning provides state-of-the-art performance in image-based plant phenotyping.
Gigascience. 2017 Oct 1;6(10):1-10. doi: 10.1093/gigascience/gix083.
4. Going deeper in the automated identification of Herbarium specimens.
BMC Evol Biol. 2017 Aug 11;17(1):181. doi: 10.1186/s12862-017-1014-z.
5. Deep Count: Fruit Counting Based on Deep Simulated Learning.
Sensors (Basel). 2017 Apr 20;17(4):905. doi: 10.3390/s17040905.
6. Localization and Classification of Paddy Field Pests using a Saliency Map and Deep Convolutional Neural Network.
Sci Rep. 2016 Feb 11;6:20410. doi: 10.1038/srep20410.
7. Deep learning.
Nature. 2015 May 28;521(7553):436-44. doi: 10.1038/nature14539.
8. Weed control changes and genetically modified herbicide tolerant crops in the USA 1996-2012.
GM Crops Food. 2014;5(4):321-32. doi: 10.4161/21645698.2014.958930.
9. The future for weed control and technology.
Pest Manag Sci. 2014 Sep;70(9):1329-39. doi: 10.1002/ps.3706. Epub 2014 Jan 31.
10. Blind inverse gamma correction.
IEEE Trans Image Process. 2001;10(10):1428-33. doi: 10.1109/83.951529.