
A Lightweight YOLOv4-Based Forestry Pest Detection Method Using Coordinate Attention and Feature Fusion.

Author information

Zha Mingfeng, Qian Wenbin, Yi Wenlong, Hua Jing

Affiliation

School of Software, Jiangxi Agricultural University, Nanchang 330045, China.

Publication information

Entropy (Basel). 2021 Nov 27;23(12):1587. doi: 10.3390/e23121587.


DOI: 10.3390/e23121587
PMID: 34945892
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8700145/
Abstract

Traditional pest detection methods are difficult to apply in complex forestry environments because of their low accuracy and speed. To address this issue, this paper proposes the YOLOv4_MF model. YOLOv4_MF uses MobileNetv2 as the feature extraction backbone and replaces standard convolutions with depthwise separable convolutions to reduce the number of model parameters. In addition, a coordinate attention mechanism is embedded in MobileNetv2 to enhance feature information. A symmetric structure consisting of a three-layer spatial pyramid pooling module is presented, and an improved feature fusion structure is designed to fuse target information. For the loss function, focal loss is used instead of cross-entropy loss to strengthen the network's learning of small targets. Experimental results show that YOLOv4_MF achieves 4.24% higher mAP, 4.37% higher precision, and 6.68% higher recall than YOLOv4, while the model size is reduced to 1/6 that of YOLOv4. The proposed algorithm also achieves 38.62% mAP on the COCO dataset, competitive with several state-of-the-art algorithms.
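Two of the techniques named in the abstract can be illustrated with simple arithmetic: the parameter savings of depthwise separable convolution (the reason MobileNetv2 is lightweight) and the down-weighting behavior of focal loss on easy examples. The sketch below is a generic illustration of these standard formulas, not code from the paper; the function names and the example channel sizes are illustrative.

```python
import math

def conv_params(k, c_in, c_out):
    # Standard k x k convolution: every output channel mixes all input channels.
    return k * k * c_in * c_out

def dw_separable_params(k, c_in, c_out):
    # Depthwise separable convolution = depthwise (one k x k filter per input
    # channel) followed by a pointwise (1 x 1) convolution that mixes channels.
    return k * k * c_in + c_in * c_out

def focal_loss(p_t, alpha=0.25, gamma=2.0):
    # Focal loss, FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t), shrinks the
    # loss of well-classified examples so training focuses on hard targets.
    return -alpha * (1.0 - p_t) ** gamma * math.log(p_t)

# Example: a 3x3 convolution mapping 128 -> 256 channels.
std = conv_params(3, 128, 256)          # 294912 parameters
sep = dw_separable_params(3, 128, 256)  # 33920 parameters (~8.7x fewer)
print(f"standard: {std}, separable: {sep}, ratio: {std / sep:.1f}x")

# A hard example (p_t = 0.1) keeps most of its loss weight, while an easy
# example (p_t = 0.9) is strongly suppressed by the (1 - p_t)^gamma factor.
print(f"FL(0.1) = {focal_loss(0.1):.4f}")
print(f"FL(0.9) = {focal_loss(0.9):.6f}")
```

The roughly 8.7x reduction in this single layer is consistent in spirit with the abstract's claim that the full model shrinks to about 1/6 the size of YOLOv4, since other components (attention, SPP, detection heads) are not separable.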

Figures

Figures 1–14 of the article (entropy-23-01587-g001 through g014) are available in the PMC full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8700145/

Similar articles

[1]
A Lightweight YOLOv4-Based Forestry Pest Detection Method Using Coordinate Attention and Feature Fusion.

Entropy (Basel). 2021-11-27

[2]
Lightweight Helmet Detection Algorithm Using an Improved YOLOv4.

Sensors (Basel). 2023-1-21

[3]
Precision Detection of Dense Plums in Orchards Using the Improved YOLOv4 Model.

Front Plant Sci. 2022-3-11

[4]
Improved YOLOv4 recognition algorithm for pitaya based on coordinate attention and combinational convolution.

Front Plant Sci. 2022-10-18

[5]
A novel algorithm for small object detection based on YOLOv4.

PeerJ Comput Sci. 2023-3-22

[6]
Towards High Accuracy Pedestrian Detection on Edge GPUs.

Sensors (Basel). 2022-8-10

[7]
Combination of UAV and Raspberry Pi 4B: Airspace detection of red imported fire ant nests using an improved YOLOv4 model.

Math Biosci Eng. 2022-9-15

[8]
Detection of Pine Wilt Nematode from Drone Images Using UAV.

Sensors (Basel). 2022-6-22

[9]
Research on Pedestrian Detection Algorithm Based on MobileNet-YoLo.

Comput Intell Neurosci. 2022

[10]
E-YOLOv4-tiny: a traffic sign detection algorithm for urban road scenarios.

Front Neurorobot. 2023-7-18

Cited by

[1]
A WAD-YOLOv8-based method for classroom student behavior detection.

Sci Rep. 2025-3-20

[2]
Target detection of helicopter electric power inspection based on the feature embedding convolution model.

PLoS One. 2024

[3]
Research on vehicle detection based on improved YOLOX_S.

Sci Rep. 2023-12-27

[4]
TeaDiseaseNet: multi-scale self-attentive tea disease detection.

Front Plant Sci. 2023-10-11

[5]
ASFL-YOLOX: an adaptive spatial feature fusion and lightweight detection method for insect pests of the Papilionidae family.

Front Plant Sci. 2023-6-14

[6]
GABNet: global attention block for retinal OCT disease classification.

Front Neurosci. 2023-6-2

[7]
Lightweight Helmet Detection Algorithm Using an Improved YOLOv4.

Sensors (Basel). 2023-1-21

[8]
Tomato Pest Recognition Algorithm Based on Improved YOLOv4.

Front Plant Sci. 2022-7-13

[9]
Visual Recognition of Traffic Signs in Natural Scenes Based on Improved RetinaNet.

Entropy (Basel). 2022-1-12

References

[1]
An Enhanced Insect Pest Counter Based on Saliency Map and Improved Non-Maximum Suppression.

Insects. 2021-8-6

[2]
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks.

IEEE Trans Pattern Anal Mach Intell. 2016-6-6

[3]
Spatial Pyramid Pooling in Deep Convolutional Networks for Visual Recognition.

IEEE Trans Pattern Anal Mach Intell. 2015-9

[4]
A tool for developing an automatic insect identification system based on wing outlines.

Sci Rep. 2015-8-7
