Suppr 超能文献



Vision and 2D LiDAR Fusion-Based Navigation Line Extraction for Autonomous Agricultural Robots in Dense Pomegranate Orchards.

Authors

Shi Zhikang, Bai Ziwen, Yi Kechuan, Qiu Baijing, Dong Xiaoya, Wang Qingqing, Jiang Chunxia, Zhang Xinwei, Huang Xin

Affiliations

College of Intelligent Manufacturing, Anhui Science and Technology University, Chuzhou 239000, China.

Key Laboratory of Plant Protection Engineering, Ministry of Agriculture and Rural Affairs, Jiangsu University, Zhenjiang 212013, China.

Publication

Sensors (Basel). 2025 Sep 2;25(17):5432. doi: 10.3390/s25175432.

DOI: 10.3390/s25175432
PMID: 40942859
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12431225/
Abstract

To address the insufficient accuracy of traditional single-sensor navigation methods in dense planting environments of pomegranate orchards, this paper proposes a vision and LiDAR fusion-based navigation line extraction method for orchard environments. The proposed method integrates a YOLOv8-ResCBAM trunk detection model, a reverse ray projection fusion algorithm, and geometric constraint-based navigation line fitting techniques. The object detection model enables high-precision real-time detection of pomegranate tree trunks. A reverse ray projection algorithm is proposed to convert pixel coordinates from visual detection into three-dimensional rays and compute their intersections with LiDAR scanning planes, achieving effective association between visual and LiDAR data. Finally, geometric constraints are introduced to improve the RANSAC algorithm for navigation line fitting, combined with Kalman filtering techniques to reduce navigation line fluctuations. Field experiments demonstrate that the proposed fusion-based navigation method improves navigation accuracy over single-sensor methods and semantic-segmentation methods, reducing the average lateral error to 5.2 cm, yielding an average lateral error RMS of 6.6 cm, and achieving a navigation success rate of 95.4%. These results validate the effectiveness of the vision and 2D LiDAR fusion-based approach in complex orchard environments and provide a viable route toward autonomous navigation for orchard robots.
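The reverse ray projection step described in the abstract — converting a detected pixel into a 3D ray and intersecting it with the 2D LiDAR scanning plane — can be sketched as follows. The intrinsics K, the camera-to-LiDAR extrinsics (R, t), and the pixel value are illustrative assumptions for the sketch, not the paper's calibration.

```python
import numpy as np

# Assumed pinhole intrinsics (fx, fy, cx, cy are illustrative values).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Assumed camera-to-LiDAR extrinsics: camera z (forward) -> LiDAR x,
# camera x (right) -> LiDAR -y, camera y (down) -> LiDAR -z,
# camera origin 0.3 m above the LiDAR scan plane.
R = np.array([[ 0.0,  0.0, 1.0],
              [-1.0,  0.0, 0.0],
              [ 0.0, -1.0, 0.0]])
t = np.array([0.05, 0.0, 0.3])

def pixel_to_lidar_plane(u, v):
    """Back-project pixel (u, v) to a 3D ray and intersect it with the
    2D LiDAR scanning plane (z = 0 in the LiDAR frame)."""
    # Ray direction in the camera frame from the inverse intrinsics.
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    # Ray in the LiDAR frame: p(s) = t + s * (R @ d_cam), s > 0.
    d = R @ d_cam
    if abs(d[2]) < 1e-9:
        return None  # ray parallel to the scan plane: no intersection
    s = -t[2] / d[2]
    if s <= 0:
        return None  # intersection behind the camera
    return t + s * d

# Example: bottom-centre pixel of a trunk bounding box mapped into the
# scan plane, where it can be associated with a LiDAR trunk return.
p = pixel_to_lidar_plane(400.0, 300.0)
```

The intersection point lies in the scan plane (z = 0), so it can be matched directly against 2D LiDAR range returns to associate visual trunk detections with LiDAR points.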

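The geometric-constraint RANSAC fit and Kalman smoothing from the abstract can likewise be sketched. The angle constraint, distance threshold, and noise parameters below are assumptions for illustration; the paper's actual constraints and filter model may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def ransac_line(points, n_iter=200, thresh=0.05, max_angle_deg=20.0):
    """RANSAC line fit with a simple geometric constraint: candidate
    lines deviating more than max_angle_deg from the row axis (x) are
    rejected before inlier counting. Thresholds are illustrative."""
    best_inliers, best_model = 0, None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        d = d / norm
        # Geometric constraint: line must roughly follow the tree row.
        if np.degrees(np.arccos(min(abs(d[0]), 1.0))) > max_angle_deg:
            continue
        # Perpendicular distance of every point to the candidate line.
        r = points - p
        dist = np.abs(r[:, 0] * d[1] - r[:, 1] * d[0])
        inliers = int((dist < thresh).sum())
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (p, d)
    return best_model

class ScalarKalman:
    """1D Kalman filter used to smooth the navigation line's lateral
    offset between frames (process/measurement noise assumed)."""
    def __init__(self, q=1e-3, r=1e-2):
        self.x, self.p, self.q, self.r = 0.0, 1.0, q, r
    def update(self, z):
        self.p += self.q                # predict (static state model)
        k = self.p / (self.p + self.r)  # Kalman gain
        self.x += k * (z - self.x)      # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

# Synthetic trunk centres along a row at y ~ 1.0 m, plus outliers.
xs = np.linspace(0.5, 6.0, 12)
row = np.column_stack([xs, 1.0 + rng.normal(0, 0.02, xs.size)])
outliers = rng.uniform([0.0, -2.0], [6.0, 3.0], size=(4, 2))
model = ransac_line(np.vstack([row, outliers]))
```

Rejecting steep candidate lines before counting inliers keeps the fit aligned with the tree row even when outlier trunk detections are present, and the scalar filter damps frame-to-frame jitter in the fitted line's offset.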

Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/a8973154af95/sensors-25-05432-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/12ead4ec3ad1/sensors-25-05432-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/1cd8876edac2/sensors-25-05432-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/4db46a621d8f/sensors-25-05432-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/e8556965b75a/sensors-25-05432-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/3db4fde60c55/sensors-25-05432-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/ea401080126d/sensors-25-05432-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/c7a50fbd3b5c/sensors-25-05432-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/1d354ba7d705/sensors-25-05432-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/a6a86c54acb3/sensors-25-05432-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/57aa3afe9ae2/sensors-25-05432-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/8a3c6ec32c09/sensors-25-05432-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/2cefa5ddf62b/sensors-25-05432-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/d9a0efc7028e/sensors-25-05432-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/4fd733874db0/sensors-25-05432-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/37b193471091/sensors-25-05432-g016.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/dd2a607b40e6/sensors-25-05432-g017a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/722e/12431225/49f7a22893fe/sensors-25-05432-g018.jpg

Similar Articles

1. Vision and 2D LiDAR Fusion-Based Navigation Line Extraction for Autonomous Agricultural Robots in Dense Pomegranate Orchards.
Sensors (Basel). 2025 Sep 2;25(17):5432. doi: 10.3390/s25175432.
2. A long-term localization and mapping system for autonomous inspection robots in large-scale environments using 3D LiDAR sensors.
PLoS One. 2025 Jul 31;20(7):e0328169. doi: 10.1371/journal.pone.0328169. eCollection 2025.
3. OrchardQuant-3D: combining drone and LiDAR to perform scalable 3D phenotyping for characterising key canopy and floral traits in fruit orchards.
Plant Biotechnol J. 2025 Jul 23. doi: 10.1111/pbi.70229.
4. A Dynamic Kalman Filtering Method for Multi-Object Fruit Tracking and Counting in Complex Orchards.
Sensors (Basel). 2025 Jul 2;25(13):4138. doi: 10.3390/s25134138.
5. Integrated neural network framework for multi-object detection and recognition using UAV imagery.
Front Neurorobot. 2025 Jul 30;19:1643011. doi: 10.3389/fnbot.2025.1643011. eCollection 2025.
6. Research on Disaster Environment Map Fusion Construction and Reinforcement Learning Navigation Technology Based on Air-Ground Collaborative Multi-Heterogeneous Robot Systems.
Sensors (Basel). 2025 Aug 12;25(16):4988. doi: 10.3390/s25164988.
7. Singular Value Decomposition (SVD) Method for LiDAR and Camera Sensor Fusion and Pattern Matching Algorithm.
Sensors (Basel). 2025 Jun 21;25(13):3876. doi: 10.3390/s25133876.
8. Enhancing LiDAR-IMU SLAM for Infrastructure Monitoring via Dynamic Coplanarity Constraints and Joint Observation.
Sensors (Basel). 2025 Aug 27;25(17):5330. doi: 10.3390/s25175330.
9. Multi-layer robotic controller for enhancing the safety of mobile robot navigation in human-centered indoor environments.
Front Robot AI. 2025 Jul 31;12:1629931. doi: 10.3389/frobt.2025.1629931. eCollection 2025.
10. Optimization of a Navigation System for Autonomous Charging of Intelligent Vehicles Based on the Bidirectional A* Algorithm and YOLOv11n Model.
Sensors (Basel). 2025 Jul 24;25(15):4577. doi: 10.3390/s25154577.

References Cited

1. An autonomous navigation method for orchard mobile robots based on octree 3D point cloud optimization.
Front Plant Sci. 2025 Jan 7;15:1510683. doi: 10.3389/fpls.2024.1510683. eCollection 2024.
2. Environmental and geographical conditions influence color, physical properties, and physiochemical composition of pomegranate fruits.
Sci Rep. 2023 Sep 18;13(1):15447. doi: 10.1038/s41598-023-42749-z.
3. Study on Multi-Heterogeneous Sensor Data Fusion Method Based on Millimeter-Wave Radar and Camera.
Sensors (Basel). 2023 Jun 29;23(13):6044. doi: 10.3390/s23136044.
4. Research on orchard navigation method based on fusion of 3D SLAM and point cloud positioning.
Front Plant Sci. 2023 Jun 26;14:1207742. doi: 10.3389/fpls.2023.1207742. eCollection 2023.
5. Navigation of an Autonomous Spraying Robot for Orchard Operations Using LiDAR for Tree Trunk Detection.
Sensors (Basel). 2023 May 16;23(10):4808. doi: 10.3390/s23104808.
6. Real time object detection using LiDAR and camera fusion for autonomous driving.
Sci Rep. 2023 May 17;13(1):8056. doi: 10.1038/s41598-023-35170-z.
7. Interpretation and Transformation of Intrinsic Camera Parameters Used in Photogrammetry and Computer Vision.
Sensors (Basel). 2022 Dec 7;22(24):9602. doi: 10.3390/s22249602.
8. Design and development of orchard autonomous navigation spray system.
Front Plant Sci. 2022 Aug 1;13:960686. doi: 10.3389/fpls.2022.960686. eCollection 2022.
9. A Review of Computer Vision-Based Structural Deformation Monitoring in Field Environments.
Sensors (Basel). 2022 May 16;22(10):3789. doi: 10.3390/s22103789.
10. A Loosely Coupled Extended Kalman Filter Algorithm for Agricultural Scene-Based Multi-Sensor Fusion.
Front Plant Sci. 2022 Apr 25;13:849260. doi: 10.3389/fpls.2022.849260. eCollection 2022.