Suppr 超能文献



Advanced vision transformers and open-set learning for robust mosquito classification: A novel approach to entomological studies.

Author Information

Karim Ahmed Akib Jawad, Mahmud Muhammad Zawad, Khan Riasat

Affiliations

Electrical and Computer Engineering, North South University, Dhaka, Bangladesh.

Publication Information

PLoS Comput Biol. 2024 Dec 13;20(12):e1012654. doi: 10.1371/journal.pcbi.1012654. eCollection 2024 Dec.

DOI: 10.1371/journal.pcbi.1012654
PMID: 39671336
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11642999/
Abstract

Mosquito-related diseases pose a significant threat to global public health, necessitating efficient and accurate mosquito classification for effective surveillance and control. This work presents an innovative approach to mosquito classification by leveraging state-of-the-art vision transformers and open-set learning techniques. A novel framework has been introduced that integrates Transformer-based deep learning models with comprehensive data augmentation and preprocessing methods, enabling robust and precise identification of ten mosquito species. The Swin Transformer model achieves the best performance for traditional closed-set learning with 99.60% accuracy and 0.996 F1 score. The lightweight MobileViT technique attains an almost equivalent accuracy of 98.90% with significantly reduced parameters and model complexities. Next, the applied deep learning models' adaptability and generalizability in a static environment have been enhanced by using new classes of data samples during the inference stage that have not been included in the training set. The proposed framework's ability to handle unseen classes like insects similar to mosquitoes, even humans, through open-set learning further enhances its practical applicability employing the OpenMax technique and Weibull distribution. The traditional CNN model, Xception, outperforms the latest transformer with higher accuracy and F1 score for open-set learning. The study's findings highlight the transformative potential of advanced deep-learning architectures in entomology, providing a strong groundwork for future research and development in mosquito surveillance and vector control. The implications of this work extend beyond mosquito classification, offering valuable insights for broader ecological and environmental monitoring applications.
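The open-set step the abstract describes (OpenMax, which uses a Weibull distribution over distances to class centroids to divert probability mass from known classes to an "unknown" class) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the centroid values, Weibull parameters, and function names are hypothetical, and in practice the Weibull parameters would be fit on the tail of training-set distances per class.

```python
import numpy as np

def weibull_cdf(x, shape, scale):
    # Weibull CDF: probability that a training-set distance is <= x,
    # i.e. how "extreme" this sample's distance to a centroid is.
    return 1.0 - np.exp(-(np.asarray(x) / scale) ** shape)

def openmax_scores(activations, centroids, weibull_params, alpha=1):
    """Recalibrate closed-set activations into open-set probabilities.

    activations: 1-D penultimate-layer vector (one score per known class).
    centroids: per-class mean activation vectors from training data.
    weibull_params: (shape, scale) per class (hypothetical values here;
        OpenMax fits them on the largest training distances per class).
    Returns a probability vector over the known classes plus one extra
    "unknown" class.
    """
    activations = np.asarray(activations, dtype=float)
    # Distance of this activation vector to each class centroid.
    dists = np.linalg.norm(centroids - activations, axis=1)
    # Per-class probability that the distance is an outlier.
    w = np.array([weibull_cdf(d, *p) for d, p in zip(dists, weibull_params)])
    # Revise only the top-alpha classes, as in the original OpenMax.
    ranked = np.argsort(activations)[::-1]
    revised = activations.copy()
    unknown_mass = 0.0
    for i in ranked[:alpha]:
        revised[i] = activations[i] * (1.0 - w[i])
        unknown_mass += activations[i] * w[i]
    # Softmax over known classes plus the synthetic "unknown" class.
    scores = np.append(revised, unknown_mass)
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

# Illustrative use: a sample near a known-class centroid keeps most of
# its mass; a sample far from every centroid shifts mass to "unknown".
centroids = np.array([[5.0, 0.0], [0.0, 5.0]])
params = [(2.0, 3.0), (2.0, 3.0)]          # hypothetical Weibull fits
near = openmax_scores([4.8, 0.2], centroids, params)
far = openmax_scores([2.0, 2.0], centroids, params)
```

Here `near[-1]` (the unknown-class probability) stays small while `far[-1]` grows, which is the mechanism that lets the framework reject mosquito-like insects or humans not seen during training.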


Figures (PMC full text, g001–g015):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/a6805f3a3c89/pcbi.1012654.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/c972dbfd79b6/pcbi.1012654.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/2742558a263e/pcbi.1012654.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/ab7ddb943b15/pcbi.1012654.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/c86b04ba1b5e/pcbi.1012654.g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/68a7d9c53a57/pcbi.1012654.g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/34b52efff4cd/pcbi.1012654.g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/9f6c1097be62/pcbi.1012654.g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/b613485a50ae/pcbi.1012654.g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/22948c45404b/pcbi.1012654.g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/97db5ab56a4b/pcbi.1012654.g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/a95767e36a69/pcbi.1012654.g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/5dc1d49676d5/pcbi.1012654.g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/bb92d3e37d68/pcbi.1012654.g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/df67/11642999/c16fc33c6d37/pcbi.1012654.g015.jpg

Similar Articles

1. Advanced vision transformers and open-set learning for robust mosquito classification: A novel approach to entomological studies.
PLoS Comput Biol. 2024 Dec 13;20(12):e1012654. doi: 10.1371/journal.pcbi.1012654. eCollection 2024 Dec.
2. Enhanced Pneumonia Detection in Chest X-Rays Using Hybrid Convolutional and Vision Transformer Networks.
Curr Med Imaging. 2025;21:e15734056326685. doi: 10.2174/0115734056326685250101113959.
3. Robust mosquito species identification from diverse body and wing images using deep learning.
Parasit Vectors. 2024 Sep 2;17(1):372. doi: 10.1186/s13071-024-06459-3.
4. Mosquito species identification using convolutional neural networks with a multitiered ensemble model for novel species detection.
Sci Rep. 2021 Jul 1;11(1):13656. doi: 10.1038/s41598-021-92891-9.
5. A Swin Transformer-based model for mosquito species identification.
Sci Rep. 2022 Nov 4;12(1):18664. doi: 10.1038/s41598-022-21017-6.
6. Enhance fashion classification of mosquito vector species via self-supervised vision transformer.
Sci Rep. 2024 Dec 28;14(1):31517. doi: 10.1038/s41598-024-83358-8.
7. MosquitoSong+: A noise-robust deep learning model for mosquito classification from wingbeat sounds.
PLoS One. 2024 Oct 30;19(10):e0310121. doi: 10.1371/journal.pone.0310121. eCollection 2024.
8. Classification and Morphological Analysis of Vector Mosquitoes using Deep Convolutional Neural Networks.
Sci Rep. 2020 Jan 23;10(1):1012. doi: 10.1038/s41598-020-57875-1.
9. Do it the transformer way: A comprehensive review of brain and vision transformers for autism spectrum disorder diagnosis and classification.
Comput Biol Med. 2023 Dec;167:107667. doi: 10.1016/j.compbiomed.2023.107667. Epub 2023 Nov 3.
10. Swin-GA-RF: genetic algorithm-based Swin Transformer and random forest for enhancing cervical cancer classification.
Front Oncol. 2024 Jul 19;14:1392301. doi: 10.3389/fonc.2024.1392301. eCollection 2024.

References Cited in This Article

1. Rapid and non-destructive identification of Anopheles gambiae and Anopheles arabiensis mosquito species using Raman spectroscopy via machine learning classification models.
Malar J. 2023 Nov 8;22(1):342. doi: 10.1186/s12936-023-04777-y.
2. Dengue overview: An updated systemic review.
J Infect Public Health. 2023 Oct;16(10):1625-1642. doi: 10.1016/j.jiph.2023.08.001. Epub 2023 Aug 3.
3. Deep Learning-Based Image Classification for Major Mosquito Species Inhabiting Korea.
Insects. 2023 Jun 5;14(6):526. doi: 10.3390/insects14060526.
4. Deep Classification with Linearity-Enhanced Logits to Softmax Function.
Entropy (Basel). 2023 Apr 27;25(5):727. doi: 10.3390/e25050727.
5. Recent outbreak of dengue in Bangladesh: A threat to public health.
Health Sci Rep. 2023 Apr 11;6(4):e1210. doi: 10.1002/hsr2.1210. eCollection 2023 Apr.
6. Mosquito-Borne Diseases and Their Control Strategies: An Overview Focused on Green Synthesized Plant-Based Metallic Nanoparticles.
Insects. 2023 Feb 23;14(3):221. doi: 10.3390/insects14030221.
7. A protocol for developing a classification system of mosquitoes using transfer learning.
MethodsX. 2022 Nov 28;10:101947. doi: 10.1016/j.mex.2022.101947. eCollection 2023.
8. A Swin Transformer-based model for mosquito species identification.
Sci Rep. 2022 Nov 4;12(1):18664. doi: 10.1038/s41598-022-21017-6.
9. Dataset of vector mosquito images.
Data Brief. 2022 Sep 7;45:108573. doi: 10.1016/j.dib.2022.108573. eCollection 2022 Dec.
10. Vector mosquito image classification using novel RIFS feature selection and machine learning models for disease epidemiology.
Saudi J Biol Sci. 2022 Jan;29(1):583-594. doi: 10.1016/j.sjbs.2021.09.021. Epub 2021 Sep 20.