Department of Biosystems Engineering, College of Agriculture and Life Sciences, Kangwon National University, Chuncheon 24341, Korea.
Interdisciplinary Program in Smart Agriculture, College of Agriculture and Life Sciences, Kangwon National University, Chuncheon 24341, Korea.
Sensors (Basel). 2022 Feb 12;22(4):1423. doi: 10.3390/s22041423.
Unmanned aerial vehicle (UAV)-based remote sensing has recently been widely applied to crop monitoring owing to the rapid development of UAVs, and these technologies have considerable potential in smart agriculture. Field phenotyping with remote sensing is mostly performed using UAVs equipped with RGB or multispectral cameras. For accurate field phenotyping in precision agriculture, images taken from multiple perspectives must be collected simultaneously, and phenotypic measurement errors can arise from the movement of the drone and of the plants during flight. In this study, to minimize measurement error and improve the digital surface model, we propose a collaborative flight system that allows multiple UAVs to acquire images from different viewpoints simultaneously. An integrated navigation system based on MAVSDK was configured for attitude and position control of the UAVs. Using a leader-follower swarm flight algorithm and a long-range wireless network, the follower drone cooperates with the leader drone to maintain a constant speed, heading, and image overlap ratio, and to hold its position in formation to improve phenotyping. Because UAVs flying in formation can collide with one another under external disturbances such as wind, a collision avoidance algorithm was also developed. A GAZEBO-based simulation environment was established to verify and optimize the developed flight algorithm in a virtual environment. The algorithm verified and optimized in simulation was then used to fly the UAVs along the same flight path in a real field, and the simulation and field results were compared. In this comparison, the flight accuracy (RMSE) was 0.36 m in simulation and 0.46 m in the real field, comparable to that of a commercial flight-control program.
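To make the leader-follower idea concrete, the sketch below shows a minimal MAVSDK-Python offboard loop in which a follower drone tracks the leader's position with a fixed NED offset, so that speed, heading, and image overlap stay roughly constant. This is an illustrative sketch, not the authors' implementation: the coroutine receive_leader_ned() and the offset constants are hypothetical stand-ins for the paper's long-range wireless link and formation geometry, and the collision avoidance logic is omitted.

```python
# Minimal leader-follower offboard sketch using MAVSDK-Python.
# Assumption: the leader's NED position arrives over a radio link; here it is
# abstracted as the hypothetical async generator receive_leader_ned().
import asyncio

from mavsdk import System
from mavsdk.offboard import OffboardError, PositionNedYaw

# Hypothetical formation offset (metres) that the follower keeps from the
# leader so the two cameras preserve the desired image overlap ratio.
OFFSET_NORTH = 0.0
OFFSET_EAST = 5.0
OFFSET_DOWN = 0.0


async def receive_leader_ned():
    """Stand-in for the wireless link: yields leader (north, east, down) in metres."""
    while True:
        # In a real system this would parse telemetry packets from the leader.
        yield (0.0, 0.0, -10.0)
        await asyncio.sleep(0.1)


async def follow_leader():
    follower = System()
    await follower.connect(system_address="udp://:14540")

    # Wait until the autopilot reports a usable position estimate.
    async for health in follower.telemetry.health():
        if health.is_global_position_ok and health.is_home_position_ok:
            break

    await follower.action.arm()

    # Offboard mode requires an initial setpoint before it can be started.
    await follower.offboard.set_position_ned(PositionNedYaw(0.0, 0.0, -10.0, 0.0))
    try:
        await follower.offboard.start()
    except OffboardError as error:
        print(f"Offboard start failed: {error._result.result}")
        await follower.action.disarm()
        return

    # Track the leader with a constant NED offset and a fixed heading so that
    # both camera footprints stay aligned along the flight path.
    async for north, east, down in receive_leader_ned():
        setpoint = PositionNedYaw(
            north + OFFSET_NORTH,
            east + OFFSET_EAST,
            down + OFFSET_DOWN,
            0.0,
        )
        await follower.offboard.set_position_ned(setpoint)


if __name__ == "__main__":
    asyncio.run(follow_leader())
```

The same script can be pointed at a GAZEBO/PX4 SITL instance (e.g., udp://:14540) for the kind of simulation-first verification described in the abstract, then reconnected to real hardware for field flights.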