
AoI-Aware Optimization of Service Caching-Assisted Offloading and Resource Allocation in Edge Cellular Networks.

Affiliations

The Guangdong Key Laboratory of Information Security Technology, School of Computer Science and Engineering, Sun Yat-Sen University, Guangzhou 510006, China.

Publication information

Sensors (Basel). 2023 Mar 21;23(6):3306. doi: 10.3390/s23063306.

DOI: 10.3390/s23063306
PMID: 36992017
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10056929/
Abstract

The rapid development of the Internet of Things (IoT) has led to computational offloading at the edge; this is a promising paradigm for achieving intelligence everywhere. As offloading can lead to more traffic in cellular networks, cache technology is used to alleviate the channel burden. For example, a deep neural network (DNN)-based inference task requires a computation service that involves running libraries and parameters. Thus, caching the service package is necessary for repeatedly running DNN-based inference tasks. On the other hand, as the DNN parameters are usually trained in a distributed manner, IoT devices need to fetch up-to-date parameters for inference task execution. In this work, we consider the joint optimization of computation offloading, service caching, and the Age-of-Information (AoI) metric. We formulate a problem to minimize the weighted sum of the average completion delay, energy consumption, and allocated bandwidth. Then, we propose the AoI-aware service caching-assisted offloading framework (ASCO) to solve it, which consists of the method of Lagrange multipliers with the KKT condition-based offloading module (LMKO), the Lyapunov optimization-based learning and update control module (LLUC), and the Kuhn-Munkres (KM) algorithm-based channel-division fetching module (KCDF). The simulation results demonstrate that our ASCO framework achieves superior performance in terms of time overhead, energy consumption, and allocated bandwidth. It is verified that our ASCO framework benefits not only the individual task but also the global bandwidth allocation.
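
Two pieces of the abstract can be made concrete with a small sketch: the objective is a weighted sum of average completion delay, energy consumption, and allocated bandwidth, and the KCDF module matches devices to sub-channels for parameter fetching, a classic assignment problem that the Kuhn-Munkres (Hungarian) algorithm solves optimally. The Python sketch below is a minimal illustration under stated assumptions, not the paper's implementation; the weights, the cost matrix, and names such as weighted_objective are hypothetical, and SciPy's linear_sum_assignment is used as an off-the-shelf KM solver.

```python
# Hypothetical sketch of the ASCO objective and KM-based channel matching
# described in the abstract. Weights, cost model, and variable names are
# illustrative assumptions, not the paper's notation.
import numpy as np
from scipy.optimize import linear_sum_assignment

def weighted_objective(delay, energy, bandwidth, w=(1.0, 1.0, 1.0)):
    """Weighted sum of average completion delay, energy consumption,
    and allocated bandwidth -- the quantity ASCO minimizes."""
    w_d, w_e, w_b = w
    return (w_d * np.mean(delay)
            + w_e * np.mean(energy)
            + w_b * np.mean(bandwidth))

# KCDF-style fetching: assign each device to one sub-channel so that the
# total fetching cost (e.g., an AoI-weighted transmission delay) is minimal.
# Kuhn-Munkres solves this assignment; SciPy exposes it as
# linear_sum_assignment.
rng = np.random.default_rng(0)
cost = rng.uniform(1.0, 10.0, size=(4, 4))   # cost[i][j]: device i on channel j
devices, channels = linear_sum_assignment(cost)
print(list(zip(devices, channels)))          # optimal device-channel pairs
print(cost[devices, channels].sum())         # minimal total fetching cost
```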


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/26d7/10056929/3ba014489098/sensors-23-03306-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/26d7/10056929/f0334f17dcf5/sensors-23-03306-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/26d7/10056929/5d0bda89202c/sensors-23-03306-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/26d7/10056929/a56a2fb04513/sensors-23-03306-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/26d7/10056929/e11e1a17b3db/sensors-23-03306-g005.jpg

Similar articles

1. AoI-Aware Optimization of Service Caching-Assisted Offloading and Resource Allocation in Edge Cellular Networks. Sensors (Basel). 2023 Mar 21;23(6):3306. doi: 10.3390/s23063306.
2. Collaborative Task Offloading and Service Caching Strategy for Mobile Edge Computing. Sensors (Basel). 2022 Sep 7;22(18):6760. doi: 10.3390/s22186760.
3. Computational Offloading in Mobile Edge with Comprehensive and Energy Efficient Cost Function: A Deep Learning Approach. Sensors (Basel). 2021 May 19;21(10):3523. doi: 10.3390/s21103523.
4. Computing Offloading Based on TD3 Algorithm in Cache-Assisted Vehicular NOMA-MEC Networks. Sensors (Basel). 2023 Nov 9;23(22):9064. doi: 10.3390/s23229064.
5. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing. Sensors (Basel). 2018 Jun 15;18(6):1945. doi: 10.3390/s18061945.
6. Contract-Optimization Approach (COA): A New Approach for Optimizing Service Caching, Computation Offloading, and Resource Allocation in Mobile Edge Computing Network. Sensors (Basel). 2023 May 16;23(10):4806. doi: 10.3390/s23104806.
7. Federated Deep Reinforcement Learning-Based Task Offloading and Resource Allocation for Smart Cities in a Mobile Edge Network. Sensors (Basel). 2022 Jun 23;22(13):4738. doi: 10.3390/s22134738.
8. Deep Reinforcement Learning for Computation Offloading and Resource Allocation in Unmanned-Aerial-Vehicle Assisted Edge Computing. Sensors (Basel). 2021 Sep 29;21(19):6499. doi: 10.3390/s21196499.
9. Multi-Task Partial Offloading with Relay and Adaptive Bandwidth Allocation for the MEC-Assisted IoT. Sensors (Basel). 2022 Dec 24;23(1):190. doi: 10.3390/s23010190.
10. Advanced Deep Learning for Resource Allocation and Security Aware Data Offloading in Industrial Mobile Edge Computing. Big Data. 2021 Aug;9(4):265-278. doi: 10.1089/big.2020.0284. Epub 2021 Mar 2.