Oh Sang-Hyon, Park Hee-Mun, Park Jin-Hyun
Division of Animal Science, College of Agriculture and Life Science, Gyeongsang National University, Jinju 52725, Korea.
School of Mechatronics Engineering, Engineering College of Convergence Technology, Gyeongsang National University, Jinju 52725, Korea.
J Anim Sci Technol. 2023 May;65(3):638-651. doi: 10.5187/jast.2023.e41. Epub 2023 May 31.
The objective of this study was to quantitatively estimate the level of grazing area damage in outdoor free-range pig production using an unmanned aerial vehicle (UAV) equipped with an RGB image sensor. Ten corn field images were captured by the UAV over approximately two weeks, during which gestating sows were allowed to graze freely on a 100 × 50 m corn field. The images were corrected to a bird's-eye view, divided into 32 segments, and sequentially input into a YOLOv4 detector to detect the corn according to its condition. Forty-three raw training images, selected randomly from the 320 segmented images, were flipped to create 86 images, which were then further augmented by rotating them in 5-degree increments, producing 6,192 images. These 6,192 images were augmented again by applying three random color transformations to each, resulting in a dataset of 24,768 images. The occupancy rate of corn in the field was estimated efficiently using You Only Look Once (YOLO). Relative to the first day of observation (day 2), almost all the corn had disappeared by the ninth day. When grazing 20 sows in a 50 × 100 m cornfield (250 m²/sow), the animals should be rotated to other grazing areas after about five days to protect the cover crop. In agricultural technology, most research using machine and deep learning concerns the detection of fruits and pests, and research on other application fields is needed. In addition, large-scale image data collected by experts in the field are required as training data for deep learning. When the available data are insufficient, extensive data augmentation is required.
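The dataset-size arithmetic reported in the abstract (43 raw images, flipped, rotated in 5-degree steps, then color-jittered) can be verified with a short sketch. The function name is illustrative, and the assumption that each rotated original is kept alongside its three color variants is inferred from the reported counts, not stated in the paper:

```python
# Sketch of the augmentation-count arithmetic described in the abstract.
# Pipeline: horizontal flip -> rotations in 5-degree increments -> color jitter.

def augmented_dataset_size(n_raw: int, rot_step_deg: int = 5,
                           n_color_variants: int = 3) -> tuple:
    """Return (after_flip, after_rotation, final) image counts."""
    after_flip = n_raw * 2                           # each image plus its mirror
    orientations = 360 // rot_step_deg               # 72 orientations at 5-degree steps
    after_rotation = after_flip * orientations       # 86 * 72 = 6,192
    final = after_rotation * (1 + n_color_variants)  # originals kept + 3 color variants
    return after_flip, after_rotation, final

print(augmented_dataset_size(43))  # (86, 6192, 24768)
```

The counts match the abstract exactly (86, 6,192, and 24,768), which supports the inferred structure of the pipeline.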