Andújar Dionisio, Dorado José, Fernández-Quintanilla César, Ribeiro Angela
Center for Automation and Robotics, Spanish National Research Council, CSIC-UPM, Arganda del Rey, Madrid 28500, Spain.
Institute of Agricultural Sciences, Spanish National Research Council, CSIC, Madrid 28006, Spain.
Sensors (Basel). 2016 Jun 25;16(7):972. doi: 10.3390/s16070972.
The use of depth cameras in precision agriculture is increasing day by day. This type of sensor has been used for the plant structure characterization of several crops. However, the discrimination of small plants, such as weeds, is still a challenge within agricultural fields. Improvements in the new Microsoft Kinect v2 sensor make it possible to capture plant details. A dual methodology combining height selection and RGB (Red, Green, Blue) segmentation can separate crops, weeds, and soil. This paper explores the possibilities of this sensor by using Kinect Fusion algorithms to reconstruct 3D point clouds of weed-infested maize crops under real field conditions. The processed models showed good consistency between the 3D depth images and ground measurements of the actual structural parameters. Maize plants were identified in the samples by height selection of the connected faces and showed a correlation of 0.77 with maize biomass. Because the weeds were much shorter, RGB recognition was necessary to separate them from the soil microrelief of the samples, achieving a good correlation of 0.83 with weed biomass. In addition, weed density showed good correlation with volumetric measurements. The canonical discriminant analysis showed promising results for classification into monocots and dicots. These results suggest that estimating volume using the Kinect methodology can be a highly accurate method for crop status determination and weed detection. It offers several possibilities for the automation of agricultural processes through the construction of a new system integrating these sensors and the development of algorithms to properly process the information they provide.
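The dual methodology described above (height selection for tall maize plants, colour segmentation for low-lying weeds versus soil) can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the point-cloud layout (rows of x, y, z, R, G, B), the height threshold, the use of an excess-green (ExG) index, and both threshold values are assumptions chosen for demonstration only.

```python
import numpy as np

def classify_points(cloud, crop_height=0.30, exg_thresh=20.0):
    """Label each point-cloud row (x, y, z, R, G, B) as crop, weed, or soil.

    Hypothetical sketch of the dual methodology: points higher than
    `crop_height` (metres above ground level) are labelled as crop
    (maize); the remaining low points are split into weed vs. soil with
    an excess-green colour index, ExG = 2G - R - B. Both thresholds are
    illustrative, not the paper's calibrated parameters.
    """
    z = cloud[:, 2]
    r, g, b = cloud[:, 3], cloud[:, 4], cloud[:, 5]
    exg = 2.0 * g - r - b                      # excess-green colour index

    labels = np.full(len(cloud), "soil", dtype=object)
    labels[z >= crop_height] = "crop"          # height selection step
    # RGB segmentation step: green-dominant low points are weeds
    labels[(z < crop_height) & (exg > exg_thresh)] = "weed"
    return labels

# Tiny synthetic cloud: one tall green point, one low green point,
# one low greyish point (x, y, z, R, G, B).
cloud = np.array([
    [0.0, 0.0, 0.50,  60.0, 120.0,  40.0],   # tall  -> crop
    [0.1, 0.0, 0.05,  60.0, 150.0,  40.0],   # low, green -> weed
    [0.2, 0.0, 0.02, 120.0, 110.0, 100.0],   # low, grey  -> soil
])
print(classify_points(cloud))                 # → ['crop' 'weed' 'soil']
```

In practice the height values would come from the Kinect Fusion reconstruction after fitting a ground plane, and the colour thresholds would be calibrated against field samples; the sketch only shows how the two criteria are combined.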