Key Laboratory of Modern Precision Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing 100083, China.
Key Laboratory of Agricultural Information Acquisition Technology, Ministry of Agriculture and Rural Affairs, China Agricultural University, Beijing 100083, China.
Sensors (Basel). 2020 Jul 10;20(14):3848. doi: 10.3390/s20143848.
The heart girth parameter is an important indicator of the growth and development of pigs and provides critical guidance for optimizing healthy pig breeding. To overcome the heavy workload and poor adaptability of the traditional measurement methods currently used in pig breeding, this paper proposes an automated pig heart girth measurement method using two Kinect depth sensors. First, a two-view pig depth image acquisition platform is established for data collection; after preprocessing, the two-view point clouds are registered and fused by a feature-based improved 4-Point Congruent Set (4PCS) method. Second, the fused point cloud is pose-normalized, and the axillary contour is used to automatically extract the heart girth measurement point. Finally, taking this point as the starting point, a circumferential section perpendicular to the ground is intercepted from the pig point cloud, and the complete heart girth point cloud is obtained by mirror symmetry. The heart girth is measured along this point cloud using the shortest-path method. Experiments with the proposed method were conducted on two-view data from 26 live pigs. The results showed that the absolute errors of the heart girth measurements were all less than 4.19 cm and the average relative error was 2.14%, indicating the high accuracy and efficiency of this method.
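The final step of the pipeline, measuring the girth along the cross-section point cloud, can be illustrated with a minimal sketch. The code below is not the paper's shortest-path method; it is a simplified stand-in that orders the points of one vertical cross-section by polar angle around their centroid and sums consecutive segment lengths to approximate the closed girth. The function name, the 2-D (y, z) projection of the slice, and the sampling are all assumptions for illustration.

```python
import math

def cross_section_girth(points):
    """Approximate the girth of a closed body cross-section.

    points: list of (y, z) coordinates lying on one vertical slice of
    the body point cloud (the slice perpendicular to the ground through
    the measurement point). The points are sorted by polar angle around
    the centroid, then consecutive segment lengths are summed, closing
    the loop back to the first point.

    NOTE: simplified stand-in for the paper's shortest-path measurement;
    it assumes the slice points form a single star-shaped loop around
    the centroid.
    """
    cy = sum(p[0] for p in points) / len(points)
    cz = sum(p[1] for p in points) / len(points)
    ordered = sorted(points,
                     key=lambda p: math.atan2(p[1] - cz, p[0] - cy))
    girth = 0.0
    for i in range(len(ordered)):
        y0, z0 = ordered[i]
        y1, z1 = ordered[(i + 1) % len(ordered)]  # wraps to close the loop
        girth += math.hypot(y1 - y0, z1 - z0)
    return girth

# Sanity check: points sampled on a circle of radius 50 cm should give
# a girth close to 2*pi*50 ≈ 314.16 cm.
circle = [(50 * math.cos(2 * math.pi * k / 360),
           50 * math.sin(2 * math.pi * k / 360)) for k in range(360)]
print(round(cross_section_girth(circle), 2))
```

In practice the paper first mirrors the one-sided slice to obtain the complete cross-section before measuring, and a shortest-path formulation is more robust than angular sorting when the slice is noisy or non-convex.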