


Tasseled Crop Rows Detection Based on Micro-Region of Interest and Logarithmic Transformation.

Authors

Yang Zhenling, Yang Yang, Li Chaorong, Zhou Yang, Zhang Xiaoshuang, Yu Yang, Liu Dan

Affiliations

School of Engineering, Anhui Agricultural University, Hefei, China.

Institute of Artificial Intelligence, Hefei Comprehensive Nation Science Center, Hefei, China.

Publication

Front Plant Sci. 2022 Jun 27;13:916474. doi: 10.3389/fpls.2022.916474. eCollection 2022.

DOI: 10.3389/fpls.2022.916474
PMID: 35832229
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9272774/
Abstract

Machine vision-based navigation in the maize field is significant for intelligent agriculture. Therefore, precision detection of the tasseled crop rows for navigation of agricultural machinery with an accurate and fast method remains an open question. In this article, we propose a new crop rows detection method at the tasseling stage of maize fields for agrarian machinery navigation. The whole work is achieved mainly through image augment and feature point extraction by micro-region of interest (micro-ROI). In the proposed method, we first augment the distinction between the tassels and background by the logarithmic transformation in RGB color space, and then the image is transformed to hue-saturation-value (HSV) space to extract the tassels. Second, the ROI is approximately selected and updated using the bounding box until the multiple-region of interest (multi-ROI) is determined. We further propose a feature points extraction method based on micro-ROI and the feature points are used to calculate the crop rows detection lines. Finally, the bisector of the acute angle formed by the two detection lines is used as the field navigation line. The experimental results show that the algorithm proposed has good robustness and can accurately detect crop rows. Compared with other existing methods, our method's accuracy and real-time performance have improved by about 5 and 62.3%, respectively, which can meet the accuracy and real-time requirements of agricultural vehicles' navigation in maize fields.
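The pipeline sketched in the abstract (logarithmic contrast enhancement, tassel extraction, feature points from small regions, line fitting, and an angle bisector as the navigation line) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the strip-centroid step is a crude stand-in for the paper's micro-ROI procedure, a precomputed binary tassel mask replaces the HSV thresholding, and all function names here are illustrative.

```python
import numpy as np

def log_transform(img):
    """Logarithmic intensity transform s = c * log(1 + r), rescaled to [0, 255].
    Expands low intensities while compressing highlights, which increases the
    contrast between bright tassels and the darker canopy background."""
    c = 255.0 / np.log1p(255.0)
    return np.rint(c * np.log1p(img.astype(np.float64))).astype(np.uint8)

def strip_centroids(mask, n_strips=10):
    """Split a binary tassel mask into horizontal strips (a crude stand-in for
    the paper's micro-ROIs) and return the centroid of the foreground pixels
    in each strip as (x, y) feature points."""
    h, _ = mask.shape
    pts = []
    for i in range(n_strips):
        top = i * h // n_strips
        strip = mask[top:(i + 1) * h // n_strips, :]
        ys, xs = np.nonzero(strip)
        if xs.size:
            pts.append((xs.mean(), top + ys.mean()))
    return np.array(pts)

def fit_row_line(points):
    """Least-squares line x = a*y + b through the feature points
    (parameterised in y so near-vertical crop rows stay well-posed)."""
    a, b = np.polyfit(points[:, 1], points[:, 0], 1)
    return a, b

def bisector_slope(a1, a2):
    """Slope dx/dy of the bisector of the acute angle between two lines
    x = a*y + b; the bisector direction serves as the navigation heading."""
    return np.tan((np.arctan(a1) + np.arctan(a2)) / 2.0)
```

In this parameterisation a perfectly vertical crop row has slope `a = 0`, and two rows converging symmetrically toward the vanishing point (slopes `a` and `-a`) yield a vertical bisector, i.e. a straight-ahead heading.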


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/7108f52bb1ae/fpls-13-916474-g0019.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/4d34ee00b76d/fpls-13-916474-g0001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/3a12453590cd/fpls-13-916474-g0002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/691e251ccd8b/fpls-13-916474-g0003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/e9a24f9ac421/fpls-13-916474-g0004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/aeb8da2acf1e/fpls-13-916474-g0005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/595e2e2ecee8/fpls-13-916474-g0006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/8a846ec1d667/fpls-13-916474-g0007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/3a72a1002ef1/fpls-13-916474-g0008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/f85a2d07ab9e/fpls-13-916474-g0009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/3bdc171a982a/fpls-13-916474-g0010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/8f549cf464f8/fpls-13-916474-g0011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/3a7d3a8d1ccb/fpls-13-916474-g0012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/87e8e859c5ac/fpls-13-916474-g0013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/0e4eea5b2340/fpls-13-916474-g0014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/dac083811c7e/fpls-13-916474-g0015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/72122440ff7d/fpls-13-916474-g0016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/f4ed356dba3b/fpls-13-916474-g0017.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/b849/9272774/e7caca0ed2bb/fpls-13-916474-g0018.jpg

Similar Articles

1
Tasseled Crop Rows Detection Based on Micro-Region of Interest and Logarithmic Transformation.
Front Plant Sci. 2022 Jun 27;13:916474. doi: 10.3389/fpls.2022.916474. eCollection 2022.
2
Improved Real-Time Semantic Segmentation Network Model for Crop Vision Navigation Line Detection.
Front Plant Sci. 2022 Jun 2;13:898131. doi: 10.3389/fpls.2022.898131. eCollection 2022.
3
Inter-row navigation line detection for cotton with broken rows.
Plant Methods. 2022 Jul 2;18(1):90. doi: 10.1186/s13007-022-00913-y.
4
Improving the maize crop row navigation line recognition method of YOLOX.
Front Plant Sci. 2024 Mar 28;15:1338228. doi: 10.3389/fpls.2024.1338228. eCollection 2024.
5
Adaptive Multi-ROI Agricultural Robot Navigation Line Extraction Based on Image Semantic Segmentation.
Sensors (Basel). 2022 Oct 11;22(20):7707. doi: 10.3390/s22207707.
6
Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields.
Sensors (Basel). 2020 Sep 14;20(18):5249. doi: 10.3390/s20185249.
7
Navigation path extraction for inter-row robots in Panax notoginseng shade house based on Im-YOLOv5s.
Front Plant Sci. 2023 Oct 17;14:1246717. doi: 10.3389/fpls.2023.1246717. eCollection 2023.
8
Maize tassels detection: a benchmark of the state of the art.
Plant Methods. 2020 Aug 8;16:108. doi: 10.1186/s13007-020-00651-z. eCollection 2020.
9
A SVM and SLIC Based Detection Method for Paddy Field Boundary Line.
Sensors (Basel). 2020 May 3;20(9):2610. doi: 10.3390/s20092610.
10
A Rust Extraction and Evaluation Method for Navigation Buoys Based on Improved U-Net and Hue, Saturation, and Value.
Sensors (Basel). 2023 Oct 24;23(21):8670. doi: 10.3390/s23218670.

Cited By

1
Real-time crop row detection using computer vision - application in agricultural robots.
Front Artif Intell. 2024 Oct 30;7:1435686. doi: 10.3389/frai.2024.1435686. eCollection 2024.

References

1
Supervised Learning in Multilayer Spiking Neural Networks With Spike Temporal Error Backpropagation.
IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10141-10153. doi: 10.1109/TNNLS.2022.3164930. Epub 2023 Nov 30.
2
Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks.
IEEE Trans Neural Netw Learn Syst. 2022 May;33(5):1947-1958. doi: 10.1109/TNNLS.2021.3110991. Epub 2022 May 2.
3
Machine Vision Systems in Precision Agriculture for Crop Farming.
J Imaging. 2019 Dec 7;5(12):89. doi: 10.3390/jimaging5120089.
4
Autonomous Crop Row Guidance Using Adaptive Multi-ROI in Strawberry Fields.
Sensors (Basel). 2020 Sep 14;20(18):5249. doi: 10.3390/s20185249.
5
Towards spike-based machine intelligence with neuromorphic computing.
Nature. 2019 Nov;575(7784):607-617. doi: 10.1038/s41586-019-1677-2. Epub 2019 Nov 27.
6
All-optical spiking neurosynaptic networks with self-learning capabilities.
Nature. 2019 May;569(7755):208-214. doi: 10.1038/s41586-019-1157-8. Epub 2019 May 8.
7
A Highly Effective and Robust Membrane Potential-Driven Supervised Learning Method for Spiking Neurons.
IEEE Trans Neural Netw Learn Syst. 2019 Jan;30(1):123-137. doi: 10.1109/TNNLS.2018.2833077. Epub 2018 May 28.
8
Potential Applications and Limitations of Electronic Nose Devices for Plant Disease Diagnosis.
Sensors (Basel). 2017 Nov 11;17(11):2596. doi: 10.3390/s17112596.
9
Hough forests for object detection, tracking, and action recognition.
IEEE Trans Pattern Anal Mach Intell. 2011 Nov;33(11):2188-202. doi: 10.1109/TPAMI.2011.70.