Suppr 超能文献


Image synthesis of apparel stitching defects using deep convolutional generative adversarial networks.

Authors

Ul-Huda Noor, Ahmad Haseeb, Banjar Ameen, Alzahrani Ahmed Omar, Ahmad Ibrar, Naeem M Salman

Affiliations

Department of Computer Science, National Textile University, Faisalabad, Pakistan.

College of Computer Science and Engineering, University of Jeddah, 21959, Jeddah, Saudi Arabia.

Publication

Heliyon. 2024 Feb 15;10(4):e26466. doi: 10.1016/j.heliyon.2024.e26466. eCollection 2024 Feb 29.

DOI: 10.1016/j.heliyon.2024.e26466
PMID: 38420437
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10900799/
Abstract

In industrial manufacturing, the detection of stitching defects in fabric has become a pivotal stage in ensuring product quality. Deep learning-based fabric defect detection models have demonstrated remarkable accuracy, but they typically require vast amounts of training data. Unfortunately, practical production lines usually lack a sufficient quantity of apparel stitching defect images, due to limited research-industry collaboration and privacy concerns. To address this challenge, this study introduces an approach based on DCGAN (Deep Convolutional Generative Adversarial Network) that automatically generates images of stitching defects in fabric. The evaluation encompasses both quantitative and qualitative assessments, supported by extensive comparative experiments. For validation, ten industrial experts rated the generated images at 80% accuracy, and the Fréchet Inception Distance (FID) likewise indicated promising results. These outcomes underscore the effectiveness of the proposed defect generation model: it produces realistic stitching defect data, bridging the gap caused by data scarcity in practical industrial settings.
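The abstract reports evaluation with the Fréchet Inception Distance. In FID, both image sets are first embedded by an Inception network; the distance is then the Fréchet distance between Gaussian fits to the two feature sets. As an illustrative sketch only (not the authors' code), the distance itself can be computed from any two `(n_samples, n_features)` feature arrays; the function name and the toy Gaussian data below are made up for the demonstration.

```python
import numpy as np

def frechet_distance(feats_a, feats_b):
    """Fréchet distance between Gaussian fits to two feature sets:
    ||mu_a - mu_b||^2 + Tr(C_a + C_b - 2 (C_a C_b)^{1/2})."""
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    diff = mu_a - mu_b
    # Tr((C_a C_b)^{1/2}) via the eigenvalues of the product; clip
    # tiny negative values caused by floating-point error.
    eigvals = np.linalg.eigvals(cov_a @ cov_b)
    tr_sqrt = np.sum(np.sqrt(np.clip(eigvals.real, 0.0, None)))
    return float(diff @ diff + np.trace(cov_a) + np.trace(cov_b) - 2.0 * tr_sqrt)

rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(500, 16))  # stand-in for real-image features
fake = rng.normal(0.5, 1.0, size=(500, 16))  # stand-in for generated features
print(frechet_distance(real, real))  # ≈ 0 for identical sets
print(frechet_distance(real, fake))  # grows with the distribution shift
```

A lower value means the generated feature distribution sits closer to the real one, which is why the abstract reads a small FID as a promising result.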

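The abstract names DCGAN as the generator family but does not spell out the architecture. In the standard DCGAN recipe (Radford et al.), the generator grows a latent vector into an image through a stack of stride-2 transposed convolutions; a shape-level sketch, assuming the usual kernel-4/stride-2/padding-1 layers and a 64×64 output (not necessarily the layer configuration used in this paper), is:

```python
def deconv_out(size, kernel=4, stride=2, pad=1):
    # Spatial output size of a transposed convolution
    # (PyTorch convention): (in - 1) * stride - 2 * pad + kernel
    return (size - 1) * stride - 2 * pad + kernel

# First layer projects the 1x1 latent map to 4x4 (kernel 4, stride 1, no pad),
# then four stride-2 layers double the resolution each time.
size = deconv_out(1, kernel=4, stride=1, pad=0)
sizes = [1, size]
for _ in range(4):
    size = deconv_out(size)
    sizes.append(size)
print(sizes)  # [1, 4, 8, 16, 32, 64]
```

Each doubling step is typically followed by batch normalization and ReLU, with a tanh at the 64×64 output; the discriminator mirrors the stack with ordinary stride-2 convolutions.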

Article figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/90157545b4c4/gr1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/208d0eaf7cc3/gr2.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/552810cf96f9/gr3.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/b4cdec5b0ce4/gr4.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/d287c61393e7/gr5.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/50942161faae/gr6.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/3c8bfc3775fb/gr7.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/05ddcf606369/gr8.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/3a75bb881995/gr9.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/fc092a2f0314/gr10.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/937d62825deb/gr11.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/a9581e97c968/gr12.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/2d0cc652c2cd/gr13.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/a72bc09ab36e/gr14.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/c3d5ee794579/gr15.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/15b009fbc6e6/gr16.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/308fb85dd7e5/gr17.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/434c1580636d/gr18.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/47b5df75b064/gr19.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/a970552988e5/gr20.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/103534361602/gr21.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/cd031fb0df94/gr22.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/4a42/10900799/bacb344ae35f/gr23.jpg

Similar Articles

1. Image synthesis of apparel stitching defects using deep convolutional generative adversarial networks. Heliyon. 2024 Feb 15;10(4):e26466. doi: 10.1016/j.heliyon.2024.e26466. eCollection 2024 Feb 29.
2. Improving synthetic media generation and detection using generative adversarial networks. PeerJ Comput Sci. 2024 Sep 20;10:e2181. doi: 10.7717/peerj-cs.2181. eCollection 2024.
3. A Comparative Analysis of the Novel Conditional Deep Convolutional Neural Network Model, Using Conditional Deep Convolutional Generative Adversarial Network-Generated Synthetic and Augmented Brain Tumor Datasets for Image Classification. Brain Sci. 2024 May 30;14(6):559. doi: 10.3390/brainsci14060559.
4. Generative adversarial network based synthetic data training model for lightweight convolutional neural networks. Multimed Tools Appl. 2023 May 20:1-23. doi: 10.1007/s11042-023-15747-6.
5. High-content image generation for drug discovery using generative adversarial networks. Neural Netw. 2020 Dec;132:353-363. doi: 10.1016/j.neunet.2020.09.007. Epub 2020 Sep 20.
6. Skin Lesion Synthesis and Classification Using an Improved DCGAN Classifier. Diagnostics (Basel). 2023 Aug 9;13(16):2635. doi: 10.3390/diagnostics13162635.
7. Integration of Deep Learning Network and Robot Arm System for Rim Defect Inspection Application. Sensors (Basel). 2022 May 22;22(10):3927. doi: 10.3390/s22103927.
8. Generating ultrasonic images indistinguishable from real images using Generative Adversarial Networks. Ultrasonics. 2022 Feb;119:106610. doi: 10.1016/j.ultras.2021.106610. Epub 2021 Oct 27.
9. A Novel COVID-19 Detection Model Based on DCGAN and Deep Transfer Learning. Procedia Comput Sci. 2022;204:65-72. doi: 10.1016/j.procs.2022.08.008. Epub 2022 Sep 10.
10. DG-GAN: A High Quality Defect Image Generation Method for Defect Detection. Sensors (Basel). 2023 Jun 26;23(13):5922. doi: 10.3390/s23135922.

Cited By

1. Local-Peak Scale-Invariant Feature Transform for Fast and Random Image Stitching. Sensors (Basel). 2024 Sep 4;24(17):5759. doi: 10.3390/s24175759.

References

1. DG-GAN: A High Quality Defect Image Generation Method for Defect Detection. Sensors (Basel). 2023 Jun 26;23(13):5922. doi: 10.3390/s23135922.
2. Multistage GAN for Fabric Defect Detection. IEEE Trans Image Process. 2019 Dec 19. doi: 10.1109/TIP.2019.2959741.
3. Effect of augmented datasets on deep convolutional neural networks applied to chest radiographs. Clin Radiol. 2019 Sep;74(9):697-701. doi: 10.1016/j.crad.2019.04.025. Epub 2019 Jun 10.
4. 3D conditional generative adversarial networks for high-quality PET image estimation at low dose. Neuroimage. 2018 Jul 1;174:550-562. doi: 10.1016/j.neuroimage.2018.03.045. Epub 2018 Mar 20.
5. Automated Classification of Lung Cancer Types from Cytological Images Using Deep Convolutional Neural Networks. Biomed Res Int. 2017;2017:4067832. doi: 10.1155/2017/4067832. Epub 2017 Aug 13.