Mubarakali Azath, Durai Anand Deva, Alshehri Mohmmed, AlFarraj Osama, Ramakrishnan Jayabrabu, Mavaluru Dinesh
Department of Computer Networks and Communication Engineering, College of Computer Science, King Khalid University, Abha, Saudi Arabia.
College of Computer Science, King Khalid University, Abha, Saudi Arabia.
Big Data. 2023 Apr;11(2):128-136. doi: 10.1089/big.2020.0090. Epub 2020 Jul 14.
Fog computing plays a vital role in data transmission to distributed devices in the Internet of Things (IoT) and other network paradigms. The fundamental element of fog computing is an additional layer of fog nodes inserted between IoT devices/nodes and the cloud server; these nodes are used to speed up time-critical applications. Current research efforts and user trends are pushing for fog computing, but the path is far from paved. Unless fog computing can reap the benefits of software-defined networking and network function virtualization techniques, network monitoring will remain an additional burden on it. However, the seamless integration of these techniques into fog computing is not easy and remains a challenging task. To overcome the issues mentioned above, the proposed fog-based delay-sensitive data transmission algorithm provides a robust optimization technique that ensures low and predictable delay in delay-sensitive applications such as traffic monitoring and vehicle tracking. The method reduces latency by storing and processing data close to the source of information, at an optimal depth in the network. Deployment results show that, compared with conventional methodologies, the proposed algorithm reduces round-trip time by 15.67 ms and average delay by 2 seconds on 10 KB, 100 KB, and 1 MB data sets across the India, Singapore, and Japan Amazon data center regions.
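The abstract describes the placement principle only at a high level: store and process data close to the source, at an optimal depth in the network. The sketch below is a minimal, assumed illustration of how such a delay-aware placement choice among fog tiers and the cloud might look; the delay model, node parameters, and every name in it (Node, estimated_delay_ms, select_node) are illustrative assumptions and not the algorithm published in the paper.

```python
from dataclasses import dataclass


@dataclass
class Node:
    """A candidate processing location: a fog node at some depth, or the cloud (assumed model)."""
    name: str
    depth: int               # hops from the IoT data source (cloud = deepest)
    rtt_ms: float            # estimated round-trip time to the data source
    throughput_mbps: float   # usable uplink bandwidth toward this node
    service_ms: float        # estimated processing time for one request


def estimated_delay_ms(node: Node, payload_kb: float) -> float:
    """Total delay = payload transfer time + network RTT + service time (illustrative model)."""
    transfer_ms = (payload_kb * 8.0) / (node.throughput_mbps * 1000.0) * 1000.0
    return transfer_ms + node.rtt_ms + node.service_ms


def select_node(candidates: list[Node], payload_kb: float) -> Node:
    """Pick the candidate (fog tier or cloud) with the lowest estimated total delay."""
    return min(candidates, key=lambda n: estimated_delay_ms(n, payload_kb))


if __name__ == "__main__":
    # Hypothetical tiers; the figures are placeholders, not measurements from the paper.
    candidates = [
        Node("edge-gateway", depth=1, rtt_ms=5.0,  throughput_mbps=50.0,  service_ms=12.0),
        Node("regional-fog", depth=2, rtt_ms=20.0, throughput_mbps=200.0, service_ms=6.0),
        Node("cloud-region", depth=4, rtt_ms=90.0, throughput_mbps=500.0, service_ms=4.0),
    ]
    for size_kb in (10, 100, 1024):  # 10 KB, 100 KB, and 1 MB payloads, as in the evaluation
        best = select_node(candidates, size_kb)
        print(f"{size_kb:>5} KB -> {best.name} ({estimated_delay_ms(best, size_kb):.2f} ms estimated)")
```

The payload sizes in the example mirror the 10 KB, 100 KB, and 1 MB workloads used in the evaluation; as the payload grows, a deeper, better-provisioned tier begins to win, which is the trade-off the optimal-depth idea addresses.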