Suppr 超能文献



Real-time crop row detection using computer vision- application in agricultural robots.

Authors

Khan Md Nazmuzzaman, Rahi Adibuzzaman, Rajendran Veera P, Al Hasan Mohammad, Anwar Sohel

Affiliations

Lead Research Scientist (Kroger), 84.51°, Cincinnati, OH, United States.

Mechatronics and Autonomous Research Lab, Purdue University, Mechanical Engineering, Indianapolis, IN, United States.

Publication

Front Artif Intell. 2024 Oct 30;7:1435686. doi: 10.3389/frai.2024.1435686. eCollection 2024.

DOI: 10.3389/frai.2024.1435686
PMID: 39540198
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11558879/
Abstract

The goal of achieving autonomous navigation for agricultural robots poses significant challenges, mostly arising from the substantial natural variation in crop row images caused by weather conditions and crop growth stages. The processing time of the detection algorithm must also be low for real-time applications. To address these requirements, we propose a crop row detection algorithm with the following features: first, a projective transformation is applied to transform the camera view, and color-based segmentation is employed to distinguish crop and weed from the background; second, a clustering algorithm is used to differentiate between crop and weed pixels; finally, a robust line-fitting approach is implemented to detect crop rows. The proposed algorithm is evaluated across a diverse range of scenarios, and its efficacy is compared against four distinct existing solutions. The algorithm achieves an overall intersection over union (IOU) of 0.73 and remains robust in challenging scenarios with high weed growth. Experiments on real-time video featuring challenging scenarios show that the proposed algorithm achieves a detection accuracy of over 90% and is a viable option for real-time implementation. With its high accuracy and low inference time, the proposed methodology offers a viable solution for autonomous navigation of agricultural robots in a crop field without damaging the crop, and can thus serve as a foundation for future research.
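The segmentation, clustering, and line-fitting stages the abstract describes can be sketched in a minimal NumPy example. The excess-green color index, its threshold, the 1-D k-means grouping, and the least-squares fit below are common illustrative stand-ins, not the paper's exact methods, and the projective-transform step is omitted for brevity.

```python
import numpy as np

def segment_vegetation(rgb, thresh=20.0):
    """Color-based segmentation via the excess-green index ExG = 2G - R - B,
    a common rule for separating plants from soil (the paper's exact rule
    and threshold may differ)."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    return (2.0 * g - r - b) > thresh

def fit_crop_rows(mask, n_rows=2, iters=10):
    """Group vegetation pixels into n_rows clusters by x-coordinate
    (1-D k-means as a stand-in for the paper's clustering), then fit a
    line x = a*y + b to each cluster by least squares."""
    ys, xs = np.nonzero(mask)
    # Initialize cluster centers evenly across the observed x-range.
    centers = np.linspace(xs.min(), xs.max(), n_rows)
    for _ in range(iters):
        labels = np.argmin(np.abs(xs[:, None] - centers[None, :]), axis=1)
        centers = np.array([xs[labels == k].mean() for k in range(n_rows)])
    lines = []
    for k in range(n_rows):
        sel = labels == k
        a, b = np.polyfit(ys[sel], xs[sel], 1)  # x as a linear function of y
        lines.append((a, b))
    return lines
```

On a top-down (bird's-eye) view, crop rows are roughly vertical, so a near-zero slope `a` and an intercept `b` near each row's x-position are expected; a robust estimator (e.g. Huber loss or RANSAC) would replace the plain least-squares fit to match the "robust line-fitting" the abstract mentions.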

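The overall IOU of 0.73 reported in the abstract is a mask-overlap metric. A minimal sketch of how such a score is computed (the evaluation masks themselves come from the paper's dataset and are not reproduced here):

```python
import numpy as np

def iou(pred, truth):
    """Intersection over union of two boolean masks: |A ∩ B| / |A ∪ B|.
    1.0 means perfect overlap between prediction and ground truth; 0.0 means none."""
    pred = np.asarray(pred, dtype=bool)
    truth = np.asarray(truth, dtype=bool)
    union = np.logical_or(pred, truth).sum()
    if union == 0:
        return 0.0  # both masks empty: define IOU as 0 to avoid division by zero
    return float(np.logical_and(pred, truth).sum()) / float(union)
```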

Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/99d9ec7f79b7/frai-07-1435686-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/bb714b469f0a/frai-07-1435686-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/7657bccb8613/frai-07-1435686-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/e9846a1e4c2d/frai-07-1435686-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/ea45249be9d5/frai-07-1435686-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/ad1d1d3ac900/frai-07-1435686-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/2feb79f2f131/frai-07-1435686-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/84c5bf559069/frai-07-1435686-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/96297b4c12a3/frai-07-1435686-g0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/8aa033f5aa8c/frai-07-1435686-g0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/38ec3066ce40/frai-07-1435686-g0011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/1e8d/11558879/fac3148a0931/frai-07-1435686-g0012.jpg

Similar articles

1
Real-time crop row detection using computer vision- application in agricultural robots.
Front Artif Intell. 2024 Oct 30;7:1435686. doi: 10.3389/frai.2024.1435686. eCollection 2024.
2
Improved Real-Time Semantic Segmentation Network Model for Crop Vision Navigation Line Detection.
Front Plant Sci. 2022 Jun 2;13:898131. doi: 10.3389/fpls.2022.898131. eCollection 2022.
3
Inter-row navigation line detection for cotton with broken rows.
Plant Methods. 2022 Jul 2;18(1):90. doi: 10.1186/s13007-022-00913-y.
4
Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields.
Sensors (Basel). 2020 Sep 14;20(18):5249. doi: 10.3390/s20185249.
5
WRA-Net: Wide Receptive Field Attention Network for Motion Deblurring in Crop and Weed Image.
Plant Phenomics. 2023 Apr 5;5:0031. doi: 10.34133/plantphenomics.0031. eCollection 2023.
6
Merge Fuzzy Visual Servoing and GPS-Based Planning to Obtain a Proper Navigation Behavior for a Small Crop-Inspection Robot.
Sensors (Basel). 2016 Feb 24;16(3):276. doi: 10.3390/s16030276.
7
Tasseled Crop Rows Detection Based on Micro-Region of Interest and Logarithmic Transformation.
Front Plant Sci. 2022 Jun 27;13:916474. doi: 10.3389/fpls.2022.916474. eCollection 2022.
8
Improving the maize crop row navigation line recognition method of YOLOX.
Front Plant Sci. 2024 Mar 28;15:1338228. doi: 10.3389/fpls.2024.1338228. eCollection 2024.
9
Instance segmentation for the fine detection of crop and weed plants by precision agricultural robots.
Appl Plant Sci. 2020 Jul 28;8(7):e11373. doi: 10.1002/aps3.11373. eCollection 2020 Jul.
10
The integration of GPS and visual navigation for autonomous navigation of an Ackerman steering mobile robot in cotton fields.
Front Robot AI. 2024 Apr 12;11:1359887. doi: 10.3389/frobt.2024.1359887. eCollection 2024.
