Hybrid Cooperative Cache Based on Temporal Convolutional Networks in Vehicular Edge Network.

Affiliations

School of Information Engineering, Henan University of Science and Technology, Luoyang 471000, China.

Publication Information

Sensors (Basel). 2023 May 10;23(10):4619. doi: 10.3390/s23104619.

Abstract

With the continuous development of intelligent vehicles, the demand for in-vehicle services has grown rapidly, leading to a sharp increase in wireless network traffic. Edge caching, thanks to its locational advantage, can provide more efficient transmission services and is therefore an effective way to address this problem. However, current mainstream caching solutions consider only content popularity when formulating caching strategies, which easily causes cache redundancy between edge nodes and results in low caching efficiency. To address these problems, we propose a hybrid content value collaborative caching strategy based on a temporal convolutional network (THCS), which enables collaboration between different edge nodes under limited cache resources, thereby optimizing cached content and reducing content delivery latency. Specifically, the strategy first obtains accurate content popularity through a temporal convolutional network (TCN), then comprehensively considers multiple factors to measure the hybrid content value (HCV) of cached content, and finally uses a dynamic programming algorithm to maximize the overall HCV and make optimal caching decisions. Simulation experiments show that, compared with the benchmark scheme, THCS improves the cache hit rate by 12.3% and reduces content transmission delay by 16.7%.
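The abstract describes a two-step pipeline: a TCN predicts per-content popularity, and a dynamic programming routine then decides which contents an edge node should cache so that the total hybrid content value (HCV) is maximized under the cache capacity constraint. The following is a minimal, hypothetical sketch of the placement step only: the popularity values are hard-coded stand-ins for the TCN output, HCV is modeled here simply as popularity-weighted delay saving (the paper combines several factors), and the optimization is written as a standard 0/1 knapsack dynamic programme. All names (`Content`, `hcv`, `knapsack_cache`) and numbers are illustrative and not taken from the paper.

```python
# Sketch of HCV-maximizing cache placement as a 0/1 knapsack DP (assumed model,
# not the paper's exact formulation).

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Content:
    cid: int             # content identifier
    size_mb: int         # size in MB (integer units keep the DP table small)
    popularity: float    # predicted request probability (stand-in for TCN output)
    delay_saving: float  # delay saved per hit if served from the edge (ms)


def hcv(c: Content) -> float:
    """Hybrid content value: here simply popularity-weighted delay saving."""
    return c.popularity * c.delay_saving


def knapsack_cache(contents: List[Content], capacity_mb: int) -> Tuple[float, List[int]]:
    """Select the subset of contents that maximizes total HCV under the cache
    capacity, using the classic 0/1-knapsack dynamic programme."""
    n = len(contents)
    # dp[i][w] = best total HCV using the first i contents and w MB of cache
    dp = [[0.0] * (capacity_mb + 1) for _ in range(n + 1)]
    for i, c in enumerate(contents, start=1):
        for w in range(capacity_mb + 1):
            dp[i][w] = dp[i - 1][w]                      # option 1: skip content i
            if c.size_mb <= w:                           # option 2: cache content i
                cand = dp[i - 1][w - c.size_mb] + hcv(c)
                if cand > dp[i][w]:
                    dp[i][w] = cand
    # Backtrack to recover which contents were selected.
    chosen, w = [], capacity_mb
    for i in range(n, 0, -1):
        if dp[i][w] != dp[i - 1][w]:
            chosen.append(contents[i - 1].cid)
            w -= contents[i - 1].size_mb
    return dp[n][capacity_mb], sorted(chosen)


if __name__ == "__main__":
    # Illustrative catalogue; popularity values would come from the TCN predictor.
    catalogue = [
        Content(cid=1, size_mb=40, popularity=0.35, delay_saving=120.0),
        Content(cid=2, size_mb=25, popularity=0.20, delay_saving=90.0),
        Content(cid=3, size_mb=60, popularity=0.30, delay_saving=150.0),
        Content(cid=4, size_mb=15, popularity=0.15, delay_saving=60.0),
    ]
    best_value, cached = knapsack_cache(catalogue, capacity_mb=100)
    print(f"cached contents: {cached}, total HCV: {best_value:.1f}")
```

In this toy setting the DP caches contents 1 and 3 (exactly filling the 100 MB budget) for a total HCV of 87.0; a cooperative scheme as described in the abstract would additionally coordinate these decisions across neighbouring edge nodes to avoid redundant copies.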

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d1b0/10221780/5f3e14cfbd7d/sensors-23-04619-g001.jpg
