

Data-driven crop growth simulation on time-varying generated images using multi-conditional generative adversarial networks.

Authors

Drees Lukas, Demie Dereje T, Paul Madhuri R, Leonhardt Johannes, Seidel Sabine J, Döring Thomas F, Roscher Ribana

Affiliations

Remote Sensing Group, Institute of Geodesy and Geoinformation, University of Bonn, Niebuhrstr. 1a, Bonn, 53113, Germany.

Crop Science Group, Institute of Crop Science and Resource Conservation, University of Bonn, Katzenburgweg 5, Bonn, 53115, Germany.

Publication

Plant Methods. 2024 Jun 15;20(1):93. doi: 10.1186/s13007-024-01205-3.

DOI: 10.1186/s13007-024-01205-3
PMID: 38879522
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11179353/
Abstract

BACKGROUND

Image-based crop growth modeling can substantially contribute to precision agriculture by revealing spatial crop development over time, which allows an early and location-specific estimation of relevant future plant traits, such as leaf area or biomass. A prerequisite for realistic and sharp crop image generation is the integration of multiple growth-influencing conditions in a model, such as an image of an initial growth stage, the associated growth time, and further information about the field treatment. While image-based models provide more flexibility for crop growth modeling than process-based models, there is still a significant research gap in the comprehensive integration of various growth-influencing conditions.

METHODS

We present a two-stage framework consisting first of an image generation model and second of a growth estimation model, independently trained. The image generation model is a conditional Wasserstein generative adversarial network (CWGAN). In the generator of this model, conditional batch normalization (CBN) is used to integrate conditions of different types along with the input image. This allows the model to generate time-varying artificial images dependent on multiple influencing factors. These images are used by the second part of the framework for plant phenotyping by deriving plant-specific traits and comparing them with those of non-artificial (real) reference images. In addition, image quality is evaluated using multi-scale structural similarity (MS-SSIM), learned perceptual image patch similarity (LPIPS), and Fréchet inception distance (FID). During inference, the framework allows image generation for any combination of conditions used in training; we call this generation data-driven crop growth simulation.
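The paper does not include code; the following is a minimal NumPy sketch of conditional batch normalization (CBN) as it is commonly implemented, where the condition embedding (e.g. growth time plus treatment information) predicts a per-channel scale and shift applied to the normalized feature maps. The function name and the projection matrices `w_gamma`/`w_beta` are illustrative assumptions, not the authors' API.

```python
import numpy as np

def conditional_batch_norm(x, cond, w_gamma, w_beta, eps=1e-5):
    """Sketch of conditional batch normalization (CBN).

    x:       feature maps, shape (N, C, H, W)
    cond:    condition embedding, shape (N, D), e.g. growth time + treatment
    w_gamma: projection (D, C) mapping the condition to a per-channel scale
    w_beta:  projection (D, C) mapping the condition to a per-channel shift
    """
    # Standard batch-norm statistics over the batch and spatial dimensions.
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)

    # Condition-dependent affine parameters, centered so that a zero
    # condition reduces to plain batch normalization.
    gamma = 1.0 + cond @ w_gamma            # (N, C)
    beta = cond @ w_beta                    # (N, C)
    return gamma[:, :, None, None] * x_hat + beta[:, :, None, None]
```

In this form the same generator backbone can produce different time-varying outputs simply by changing the condition vector, which is what enables the "any combination of conditions" simulation described above.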

RESULTS

Experiments are performed on three datasets of different complexity. These datasets include the laboratory plant Arabidopsis thaliana (Arabidopsis) and crops grown under real field conditions, namely cauliflower (GrowliFlower) and crop mixtures consisting of faba bean and spring wheat (MixedCrop). In all cases, the framework allows realistic, sharp image generations with a slight loss of quality from short-term to long-term predictions. For MixedCrop grown under varying treatments (different cultivars, sowing densities), the results show that adding this treatment information increases the generation quality and the phenotyping accuracy measured by the estimated biomass. Simulations of varying growth-influencing conditions performed with the trained framework provide valuable insights into how such factors relate to crop appearance, which is particularly useful in complex, less explored crop mixture systems. Further results show that adding process-based simulated biomass as a condition increases the accuracy of the phenotypic traits derived from the predicted images. This demonstrates the potential of our framework to serve as an interface between a data-driven and a process-based crop growth model.
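One of the quality metrics above, the Fréchet inception distance (FID), compares the Gaussian statistics of real and generated image features. As a hedged illustration only, the sketch below computes the Fréchet distance under a simplifying diagonal-covariance assumption; the full FID uses Inception-v3 features and a matrix square root of the full covariances, and the function name here is hypothetical.

```python
import numpy as np

def frechet_distance_diag(feats_real, feats_gen):
    """Fréchet distance between two feature sets, assuming diagonal
    covariances for brevity (full FID uses full covariance matrices
    and a matrix square root of their product)."""
    mu_r, mu_g = feats_real.mean(axis=0), feats_gen.mean(axis=0)
    var_r, var_g = feats_real.var(axis=0), feats_gen.var(axis=0)
    # ||mu_r - mu_g||^2 + Tr(S_r + S_g - 2 (S_r S_g)^{1/2}), diagonal case
    return (np.sum((mu_r - mu_g) ** 2)
            + np.sum(var_r + var_g - 2.0 * np.sqrt(var_r * var_g)))
```

Identical feature sets give a distance of zero, and the distance grows as the generated-feature distribution drifts from the real one, which is why a small loss of quality from short-term to long-term predictions shows up as a gradually increasing FID.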

CONCLUSION

Realistic generation and simulation of future plant appearances is feasible with multi-conditional CWGANs. The presented framework complements process-based models and overcomes some of their limitations, such as their reliance on assumptions and their limited field-localization specificity, through realistic visualizations of spatial crop development that make the model predictions highly explainable.


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/cdee2d6f382a/13007_2024_1205_Fig17_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/3dd6c04db080/13007_2024_1205_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/52c6faecf880/13007_2024_1205_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/0cd6c769af13/13007_2024_1205_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/5af5b31088f4/13007_2024_1205_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/ae9cb12b1181/13007_2024_1205_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/d92dc2c8065e/13007_2024_1205_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/c12753296533/13007_2024_1205_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/18f789dda697/13007_2024_1205_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/7c00dee66036/13007_2024_1205_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/26a653d0a065/13007_2024_1205_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/4c38135bf3fd/13007_2024_1205_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/f478a9f1ffe4/13007_2024_1205_Fig12_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/b2bfe42c25b1/13007_2024_1205_Fig13_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/b120e167dcc3/13007_2024_1205_Fig14_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/58165d721e1f/13007_2024_1205_Fig15_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/61cd/11179353/9c51599ce2c1/13007_2024_1205_Fig16_HTML.jpg

Similar Articles

1
Data-driven crop growth simulation on time-varying generated images using multi-conditional generative adversarial networks.
Plant Methods. 2024 Jun 15;20(1):93. doi: 10.1186/s13007-024-01205-3.
2
Generative artificial intelligence to produce high-fidelity blastocyst-stage embryo images.
Hum Reprod. 2024 Jun 3;39(6):1197-1207. doi: 10.1093/humrep/deae064.
3
The role of unpaired image-to-image translation for stain color normalization in colorectal cancer histology classification.
Comput Methods Programs Biomed. 2023 Jun;234:107511. doi: 10.1016/j.cmpb.2023.107511. Epub 2023 Mar 26.
4
CropPainter: an effective and precise tool for trait-to-image crop visualization based on generative adversarial networks.
Plant Methods. 2022 Dec 15;18(1):138. doi: 10.1186/s13007-022-00970-3.
5
A pavement crack synthesis method based on conditional generative adversarial networks.
Math Biosci Eng. 2024 Jan;21(1):903-923. doi: 10.3934/mbe.2024038. Epub 2022 Dec 21.
6
A material decomposition method for dual-energy CT via dual interactive Wasserstein generative adversarial networks.
Med Phys. 2021 Jun;48(6):2891-2905. doi: 10.1002/mp.14828. Epub 2021 May 5.
7
High-fidelity direct contrast synthesis from magnetic resonance fingerprinting.
Magn Reson Med. 2023 Nov;90(5):2116-2129. doi: 10.1002/mrm.29766. Epub 2023 Jun 18.
8
Lesion-aware generative adversarial networks for color fundus image to fundus fluorescein angiography translation.
Comput Methods Programs Biomed. 2023 Feb;229:107306. doi: 10.1016/j.cmpb.2022.107306. Epub 2022 Dec 14.
9
Use of synthetic images for training a deep learning model for weed detection and biomass estimation in cotton.
Sci Rep. 2022 Nov 15;12(1):19580. doi: 10.1038/s41598-022-23399-z.
10
High-content image generation for drug discovery using generative adversarial networks.
Neural Netw. 2020 Dec;132:353-363. doi: 10.1016/j.neunet.2020.09.007. Epub 2020 Sep 20.

Cited By

1
GAN-based image prediction of maize growth across varieties and developmental stages.
Plant Methods. 2025 Aug 11;21(1):110. doi: 10.1186/s13007-025-01430-4.

References

1
Mixture × Genotype Effects in Cereal/Legume Intercropping.
Front Plant Sci. 2022 Apr 1;13:846720. doi: 10.3389/fpls.2022.846720. eCollection 2022.
2
Behind the Leaves: Estimation of Occluded Grapevine Berries With Conditional Generative Adversarial Networks.
Front Artif Intell. 2022 Mar 25;5:830026. doi: 10.3389/frai.2022.830026. eCollection 2022.
3
A Style-Based Generator Architecture for Generative Adversarial Networks.
IEEE Trans Pattern Anal Mach Intell. 2021 Dec;43(12):4217-4228. doi: 10.1109/TPAMI.2020.2970919. Epub 2021 Nov 3.
4
Machine Learning for Plant Phenotyping Needs Image Processing.
Trends Plant Sci. 2016 Dec;21(12):989-991. doi: 10.1016/j.tplants.2016.10.002. Epub 2016 Oct 31.
5
Diversity enhances agricultural productivity via rhizosphere phosphorus facilitation on phosphorus-deficient soils.
Proc Natl Acad Sci U S A. 2007 Jul 3;104(27):11192-6. doi: 10.1073/pnas.0704591104. Epub 2007 Jun 25.