Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields.

Affiliations

College of Engineering, South China Agricultural University, Guangzhou, China.

Publication information

PLoS One. 2019 Apr 18;14(4):e0215676. doi: 10.1371/journal.pone.0215676. eCollection 2019.


DOI: 10.1371/journal.pone.0215676
PMID: 30998770
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6472823/
Abstract

To reduce production costs and the environmental pollution caused by the overapplication of herbicide in paddy fields, the locations of rice seedlings and weeds must be detected for site-specific weed management (SSWM). With the development of deep learning, a semantic segmentation method based on SegNet, an architecture built on the fully convolutional network (FCN), was proposed. In this paper, RGB color images of rice at the seedling stage were captured in a paddy field, and ground-truth (GT) images were obtained by manually labeling the pixels of the RGB images into three separate categories: rice seedlings, background, and weeds. Class weight coefficients were calculated to address the imbalance in the number of pixels per category. The GT and RGB images were used for training and testing: 80% of the samples were randomly selected as the training dataset and the remaining 20% were used as the test dataset. The proposed method was compared with classical semantic segmentation models, namely FCN and U-Net. The average accuracy of the SegNet method was 92.7%, whereas the average accuracies of the FCN and U-Net methods were 89.5% and 70.8%, respectively. The proposed SegNet method therefore achieved higher classification accuracy and could effectively classify the pixels of rice seedlings, background, and weeds in paddy field images and locate their regions.
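As a rough illustration of two steps named in the abstract, computing class-weight coefficients for the imbalanced categories and randomly splitting the labelled samples 80/20 into training and test sets, the following is a minimal Python sketch. It assumes PyTorch and median-frequency balancing (the weighting scheme popularized with SegNet); the abstract does not specify the exact weighting formula, label encoding, or image size, so those are placeholders, not the authors' released code.

```python
# Hypothetical sketch: median-frequency class weighting and an 80/20 split
# for a three-class segmentation task (rice seedling / background / weed).
# The weighting formula, label encoding, and image size are assumptions.
import numpy as np
import torch
import torch.nn as nn

NUM_CLASSES = 3  # assumed encoding: 0 = background, 1 = rice seedling, 2 = weed

def median_frequency_weights(label_maps):
    """Per-class weight = median(freq) / freq_c, where freq_c is the pixel
    frequency of class c over the images that contain that class."""
    pixel_counts = np.zeros(NUM_CLASSES, dtype=np.int64)
    total_counts = np.zeros(NUM_CLASSES, dtype=np.int64)
    for lab in label_maps:
        for c in range(NUM_CLASSES):
            n = int(np.sum(lab == c))
            if n > 0:
                pixel_counts[c] += n
                total_counts[c] += lab.size
    freq = pixel_counts / np.maximum(total_counts, 1)
    return np.median(freq) / np.maximum(freq, 1e-12)

# Dummy ground-truth (GT) label maps standing in for the manually labelled images.
rng = np.random.default_rng(0)
gt_labels = [rng.integers(0, NUM_CLASSES, size=(360, 480)) for _ in range(10)]

# Random 80% / 20% train/test split over the labelled samples.
order = rng.permutation(len(gt_labels))
split = int(0.8 * len(order))
train_idx, test_idx = order[:split], order[split:]

# Weighted cross-entropy loss: rare classes (typically the weeds) receive larger weights.
weights = torch.tensor(median_frequency_weights(gt_labels), dtype=torch.float32)
criterion = nn.CrossEntropyLoss(weight=weights)
print("class weights:", weights.tolist(), "train/test:", len(train_idx), len(test_idx))
```

Median-frequency balancing upweights classes that occupy few pixels (here, typically the weed class), counteracting the tendency of an unweighted cross-entropy loss to favor the dominant background class.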


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0de/6472823/d3fbdd2eff3c/pone.0215676.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0de/6472823/0480d2f23161/pone.0215676.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0de/6472823/d11f77da1416/pone.0215676.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0de/6472823/48bc56310e39/pone.0215676.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0de/6472823/93dd643d353c/pone.0215676.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d0de/6472823/de8854c00b4a/pone.0215676.g006.jpg

Similar articles

[1]
Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields.

PLoS One. 2019-4-18

[2]
Weed target detection at seedling stage in paddy fields based on YOLOX.

PLoS One. 2023

[3]
A Weakly Supervised Semantic Segmentation Model of Maize Seedlings and Weed Images Based on Scrawl Labels.

Sensors (Basel). 2023-12-15

[4]
Weed and Corn Seedling Detection in Field Based on Multi Feature Fusion and Support Vector Machine.

Sensors (Basel). 2020-12-31

[5]
Skin lesion segmentation in dermoscopy images via deep full resolution convolutional networks.

Comput Methods Programs Biomed. 2018-5-19

[6]
The effect of periphyton on seed germination and seedling growth of rice (Oryza sativa) in paddy area.

Sci Total Environ. 2016-8-5

[7]
Evaluation of different deep convolutional neural networks for detection of broadleaf weed seedlings in wheat.

Pest Manag Sci. 2022-2

[8]
Evaluation of Deep Neural Networks for Semantic Segmentation of Prostate in T2W MRI.

Sensors (Basel). 2020-6-3

[9]
Enhanced photosynthesis endows seedling growth vigour contributing to the competitive dominance of weedy rice over cultivated rice.

Pest Manag Sci. 2017-1-5

[10]
Improving U-net network for semantic segmentation of corns and weeds during corn seedling stage in field.

Front Plant Sci. 2024-2-9

Cited by

[1]
Detection of weeds in teff crops using deep learning and UAV imagery for precision herbicide application.

Sci Rep. 2025-8-21

[2]
Deep learning and hyperspectral features for seedling stage identification of barnyard grass in paddy field.

Front Plant Sci. 2025-2-7

[3]
An improved U-net and attention mechanism-based model for sugar beet and weed segmentation.

Front Plant Sci. 2025-1-13

[4]
RiGaD: An aerial dataset of rice seedlings for assessing germination rates and density.

Data Brief. 2024-11-6

[5]
Memory-Augmented 3D Point Cloud Semantic Segmentation Network for Intelligent Mining Shovels.

Sensors (Basel). 2024-7-5

[6]
Crop detection technologies, mechanical weeding executive parts and working performance of intelligent mechanical weeding: a review.

Front Plant Sci. 2024-3-14

[7]
GWAS supported by computer vision identifies large numbers of candidate regulators of in planta regeneration in Populus trichocarpa.

G3 (Bethesda). 2024-4-3

[8]
Attention-aided lightweight networks friendly to smart weeding robot hardware resources for crops and weeds semantic segmentation.

Front Plant Sci. 2023-12-21

[9]
WRA-Net: Wide Receptive Field Attention Network for Motion Deblurring in Crop and Weed Image.

Plant Phenomics. 2023-4-5

[10]
A real-time smart sensing system for automatic localization and recognition of vegetable plants for weed control.

Front Plant Sci. 2023-3-27

References

[1]
Accurate Weed Mapping and Prescription Map Generation Based on Fully Convolutional Networks Using UAV Imagery.

Sensors (Basel). 2018-10-1

[2]
A Semantic Labeling Approach for Accurate Weed Mapping of High Resolution UAV Imagery.

Sensors (Basel). 2018-7-1

[3]
A fully convolutional network for weed mapping of unmanned aerial vehicle (UAV) imagery.

PLoS One. 2018-4-26

[4]
DeepLab: Semantic Image Segmentation with Deep Convolutional Nets, Atrous Convolution, and Fully Connected CRFs.

IEEE Trans Pattern Anal Mach Intell. 2017-4-27

[5]
SegNet: A Deep Convolutional Encoder-Decoder Architecture for Image Segmentation.

IEEE Trans Pattern Anal Mach Intell. 2017-1-2

[6]
Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks.

IEEE Trans Pattern Anal Mach Intell. 2016-6-6

[7]
Fully Convolutional Networks for Semantic Segmentation.

IEEE Trans Pattern Anal Mach Intell. 2016-5-24

[8]
Spatial Quality Evaluation of Resampled Unmanned Aerial Vehicle-Imagery for Weed Mapping.

Sensors (Basel). 2015-8-12
