Integrated Quality of Service for Offline and Online Services in Edge Networks via Task Offloading and Service Caching.

Authors

Zhan Chuangqiang, Zheng Shaojie, Chen Jingyu, Liang Jiachao, Zhou Xiaojie

Affiliations

School of Information Engineering, Guangdong University of Technology, Guangzhou 510006, China.

Publication

Sensors (Basel). 2024 Jul 18;24(14):4677. doi: 10.3390/s24144677.

Abstract

Edge servers frequently manage their own offline digital twin (DT) services, in addition to caching online digital twin services. However, current research often overlooks the impact of offline caching services on memory and computation resources, which can hinder the efficiency of online service task processing on edge servers. In this study, we concentrated on service caching and task offloading within a collaborative edge computing system by emphasizing the integrated quality of service (QoS) for both online and offline edge services. We considered the resource usage of both online and offline services, along with incoming online requests. To maximize the overall QoS utility, we established an optimization objective that rewards the throughput of online services while penalizing offline services that miss their soft deadlines. We formulated this as a utility maximization problem, which was proven to be NP-hard. To tackle this complexity, we reframed the optimization problem as a Markov decision process (MDP) and introduced a joint optimization algorithm for service caching and task offloading by leveraging the deep Q-network (DQN). Comprehensive experiments revealed that our algorithm enhanced the utility by at least 14.01% compared with the baseline algorithms.
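The abstract describes an objective that rewards online-service throughput while penalizing offline services that miss their soft deadlines. A minimal sketch of such a utility is given below; the function name, the linear lateness penalty, and the weight parameters are illustrative assumptions, since the paper's exact formulation is not reproduced here.

```python
def integrated_utility(online_throughputs, offline_delays, soft_deadlines,
                       reward_weight=1.0, penalty_weight=1.0):
    """Hypothetical integrated QoS utility (not the paper's exact formula).

    Rewards the total throughput of online services and penalizes each
    offline service in proportion to how far it overruns its soft deadline.
    """
    reward = reward_weight * sum(online_throughputs)
    # Lateness-proportional penalty: only overruns past the soft deadline count.
    penalty = penalty_weight * sum(
        max(0.0, delay - deadline)
        for delay, deadline in zip(offline_delays, soft_deadlines)
    )
    return reward - penalty


# Example: two online services with throughputs 10 and 5; two offline
# services finishing at t=3 (deadline 5, on time) and t=8 (deadline 6, 2 late).
u = integrated_utility([10, 5], [3, 8], [5, 6])  # 15 - 2 = 13
```

In the paper this maximization is cast as an MDP and solved with a DQN-based joint caching and offloading policy; the sketch above only illustrates the reward/penalty structure a learning agent would optimize.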

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/02ea/11281042/57237b1ad64a/sensors-24-04677-g001.jpg
