Kim Won-Suk
Department of Multimedia Engineering, Andong National University, Andong 36729, Korea.
Entropy (Basel). 2021 Apr 26;23(5):532. doi: 10.3390/e23050532.
Edge computing can deliver network services with low latency and real-time processing by providing cloud services at the network edge. Edge computing offers a number of advantages, such as low latency, locality, and network traffic distribution, but the associated resource management has become a significant challenge because of its inherent hierarchical, distributed, and heterogeneous nature. Various cloud-based network services, such as crowd sensing, hierarchical deep learning systems, and cloud gaming, each have their own traffic patterns and computing requirements. To provide a satisfactory user experience for these services, resource management that comprehensively considers service diversity, client usage patterns, and network performance indicators is required. In this study, an algorithm is proposed that simultaneously considers computing resources and network traffic load when deploying servers that provide edge services. The proposed algorithm generates candidate deployments based on factors that affect traffic load, such as the number of servers, server locations, and the mapping of clients according to service characteristics and usage. A final deployment plan is then established using a partial vector bin packing scheme that considers both the generated traffic and the computing resources in the network. The proposed algorithm is evaluated through several simulations that reflect actual network service and device characteristics.
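The abstract describes the placement step only at a high level. The sketch below illustrates, under assumed inputs, how a vector bin packing heuristic of the kind referenced (treating CPU, memory, and generated traffic as separate resource dimensions) could map service instances onto candidate edge nodes. The node names, resource dimensions, demand figures, and the first-fit-decreasing ordering are illustrative assumptions, not the paper's specific partial vector bin packing scheme.

from dataclasses import dataclass, field
from typing import Dict, List, Optional


@dataclass
class EdgeNode:
    """An edge node with a vector of capacities, e.g. [CPU cores, memory GB, uplink Mbps] (assumed dimensions)."""
    name: str
    capacity: List[float]
    used: List[float] = field(default_factory=list)

    def __post_init__(self):
        if not self.used:
            self.used = [0.0] * len(self.capacity)

    def fits(self, demand: List[float]) -> bool:
        # A demand fits only if it fits in every resource dimension.
        return all(u + d <= c for u, d, c in zip(self.used, demand, self.capacity))

    def place(self, demand: List[float]) -> None:
        self.used = [u + d for u, d in zip(self.used, demand)]


@dataclass
class ServiceInstance:
    """A service replica with a demand vector in the same dimensions as EdgeNode.capacity."""
    name: str
    demand: List[float]


def pack(instances: List[ServiceInstance], nodes: List[EdgeNode]) -> Dict[str, str]:
    """First-fit-decreasing vector bin packing: place the largest total demands first."""
    mapping: Dict[str, str] = {}
    for inst in sorted(instances, key=lambda s: sum(s.demand), reverse=True):
        target: Optional[EdgeNode] = next((n for n in nodes if n.fits(inst.demand)), None)
        if target is None:
            raise RuntimeError(f"no edge node can host {inst.name}")
        target.place(inst.demand)
        mapping[inst.name] = target.name
    return mapping


if __name__ == "__main__":
    # Illustrative nodes and services only; values are not taken from the paper.
    nodes = [EdgeNode("edge-1", [8.0, 16.0, 100.0]),
             EdgeNode("edge-2", [4.0, 8.0, 50.0])]
    services = [ServiceInstance("cloud-gaming", [4.0, 8.0, 60.0]),
                ServiceInstance("crowd-sensing", [1.0, 2.0, 10.0]),
                ServiceInstance("deep-learning", [2.0, 4.0, 20.0])]
    print(pack(services, nodes))   # maps each service name to an edge node name

In the paper's framing, candidate deployments (server count, locations, client mapping) would presumably be generated first, with a packing pass of this kind used to check resource and traffic feasibility before the final plan is selected.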