
LLM-Twin: mini-giant model-driven beyond 5G digital twin networking framework with semantic secure communication and computation.

Authors

Hong Yang, Wu Jun, Morello Rosario

Affiliations

Graduate School of Information, Production and Systems, Waseda University, Fukuoka, 8080135, Japan.

Department of Information Engineering, University "Mediterranea" of Reggio Calabria, Via Graziella, 89122, Reggio Calabria, Italy.

Publication Information

Sci Rep. 2024 Aug 17;14(1):19065. doi: 10.1038/s41598-024-69474-5.

Abstract

Beyond 5G networks provide solutions for next-generation communications; in particular, digital twin networks (DTNs) have gained increasing popularity for bridging physical and digital space. However, current DTNs face several challenges, especially when applied to scenarios that require efficient and multimodal data processing. First, current DTNs are limited in communication and computational efficiency, since they must transmit large amounts of raw data collected from physical sensors and ensure model synchronization through high-frequency computation. Second, current DTN models are domain-specific (e.g., E-health), making it difficult to handle DT scenarios with multimodal data processing requirements. Finally, current security schemes for DTNs introduce additional overheads that impair efficiency. To address these challenges, we propose a large language model (LLM) empowered DTN framework, LLM-Twin. First, based on LLMs, we propose digital twin semantic networks (DTSNs), which enable more efficient communication and computation. Second, we design a mini-giant model collaboration scheme, which enables efficient deployment of LLMs in DTNs and is adapted to handle multimodal data. Then, we design a native security policy for LLM-Twin that does not compromise efficiency. Numerical experiments and case studies demonstrate the feasibility of LLM-Twin. To our knowledge, this is the first work to propose LLM-based semantic-level DTNs.
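The efficiency argument in the abstract rests on semantic-level communication: instead of streaming every raw sensor sample to the digital twin, the sender transmits a compact meaning-level representation. The toy sketch below is purely illustrative and not from the paper (the paper's semantic encoder is an LLM; here a simple statistical summary stands in for it) to show why the payload shrinks:

```python
import json

def raw_payload(samples):
    """Serialize every raw sensor reading, as a conventional DTN would."""
    return json.dumps({"samples": samples}).encode("utf-8")

def semantic_payload(samples):
    """Serialize only a meaning-level summary of the readings.

    A simple statistical digest stands in for the paper's LLM-based
    semantic encoder; the twin can still update its state from it.
    """
    summary = {
        "count": len(samples),
        "mean": sum(samples) / len(samples),
        "min": min(samples),
        "max": max(samples),
    }
    return json.dumps(summary).encode("utf-8")

# 1,000 simulated temperature readings from a physical sensor.
readings = [20.0 + 0.01 * i for i in range(1000)]

raw = raw_payload(readings)
sem = semantic_payload(readings)
print(f"raw: {len(raw)} bytes, semantic: {len(sem)} bytes")
```

The same intuition carries over to computation: the twin synchronizes against a small semantic state rather than re-processing every raw sample at high frequency.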


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bb04/11330471/4e81ceafe09a/41598_2024_69474_Fig1_HTML.jpg
