Wu Zhenwei, Wang Xinfa, Jia Meng, Liu Minghao, Sun Chengxiu, Wu Chenyang, Wang Jianping
School of Information Engineering, Henan Institute of Science and Technology, Xinxiang, 453003, China.
College of Mechanical and Electrical Engineering, Xinxiang University, Xinxiang, 453003, China.
Sci Rep. 2024 Aug 4;14(1):18019. doi: 10.1038/s41598-024-69106-y.
Accurate, fast, and lightweight dense target detection methods are highly important for precision agriculture. To detect dense apricot flowers from drones, we propose an improved dense target detection method based on YOLOv8, named D-YOLOv8. First, we introduce a Dense Feature Pyramid Network (D-FPN) that enhances the model's ability to extract dense features. We then add a Dense Attention Layer (DAL) that focuses on dense target regions, strengthening feature extraction in dense areas while suppressing features from irrelevant regions and thereby improving dense target detection accuracy. Finally, RAW data are used to augment the dataset: RAW images carry additional original sensor information, further enriching the feature input for dense objects. We validate the method on the CARPK challenge dataset and on a dataset we constructed. The experimental results show that the proposed D-YOLOv8m achieves 98.37% AP with only 13.2 million parameters. The improved network can effectively support any dense target detection task.
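The abstract does not describe the internals of the Dense Attention Layer (DAL); as an illustrative sketch only, the following shows a standard spatial-attention design in the spirit described (amplify dense, high-response regions and suppress background), not the authors' actual implementation. The module name and structure are assumptions.

import torch
import torch.nn as nn

class DenseAttentionSketch(nn.Module):
    """Illustrative spatial attention: re-weights features so that dense
    target regions are emphasized and irrelevant areas are suppressed.
    This is a generic sketch, not the paper's DAL."""

    def __init__(self):
        super().__init__()
        # A small conv over pooled channel statistics produces a per-pixel attention map
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Channel-wise average and max pooling summarize where responses cluster
        avg_map = x.mean(dim=1, keepdim=True)
        max_map, _ = x.max(dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        # Dense (high-response) regions are amplified, low-response background is damped
        return x * attn

Such a layer could, in principle, be inserted after a feature-pyramid stage of a YOLOv8-style backbone; where exactly the paper places the DAL is not specified in the abstract.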