Department of Computer Science and Engineering, Kyung Hee University, Yongin-si, Gyeonggi-do, Rep. of Korea.
PLoS One. 2019 Aug 13;14(8):e0220813. doi: 10.1371/journal.pone.0220813. eCollection 2019.
Over the last few decades, the Internet has experienced tremendous growth in data traffic. This continuous growth, driven by the increasing number of connected devices and platforms, has dramatically boosted content consumption. However, retrieving content from the servers of Content Providers (CPs) increases network traffic and can incur high network delay and congestion. To address these challenges, we propose a joint deep learning and auction-based approach for congestion-aware caching in Named Data Networking (NDN), which aims to prevent congestion and minimize content download delay. First, using recorded network traffic data from the Internet Service Provider (ISP) network, we propose a deep learning model to predict future traffic over transit links. Second, to prevent congestion and avoid high latency on transit links that may experience congestion in the future, we propose a caching model that helps the ISP cache content with high predicted future demand. Paid content requires payment to be downloaded and cached; therefore, we propose an auction mechanism to obtain paid content at an optimal price. The simulation results show that our proposal prevents congestion and increases the profits of both ISPs and CPs.
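A minimal sketch (not the authors' implementation) of the two ISP-side steps the abstract describes: training a deep learning model on recorded transit-link traffic to predict future load, and flagging links whose predicted load approaches capacity so that high-demand content can be cached before congestion occurs. The window size, capacity threshold, and network architecture below are illustrative assumptions, not values from the paper.

```python
import numpy as np
import tensorflow as tf

WINDOW = 12            # assumed: past 12 traffic samples used to predict the next one
CAPACITY = 1.0         # assumed: normalized transit-link capacity
CACHE_THRESHOLD = 0.8  # assumed: cache proactively when predicted load > 80% of capacity

def make_windows(series, window):
    """Slice a univariate traffic series into (past-window, next-value) pairs."""
    X = np.stack([series[i:i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., np.newaxis], y   # LSTM expects (samples, timesteps, features)

# Synthetic "recorded traffic" with a daily-like cycle plus noise (placeholder data).
rng = np.random.default_rng(0)
t = np.arange(1000)
traffic = 0.5 + 0.4 * np.sin(2 * np.pi * t / 96) + 0.05 * rng.standard_normal(len(t))

X, y = make_windows(traffic, WINDOW)

# Small LSTM regressor predicting the next traffic sample from the last WINDOW samples.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)

# Predict the next-step load on the transit link and decide whether to cache proactively.
predicted_load = float(model.predict(X[-1:], verbose=0)[0, 0])
if predicted_load > CACHE_THRESHOLD * CAPACITY:
    print(f"Predicted load {predicted_load:.2f}: cache high-demand content at the ISP.")
else:
    print(f"Predicted load {predicted_load:.2f}: no proactive caching needed.")
```

In the paper's setting, the caching decision would additionally account for predicted content demand and, for paid content, the price obtained through the proposed auction mechanism; the threshold rule above only illustrates the congestion-aware trigger.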