Basic and Applied Science Department, College of Engineering and Technology, Arab Academy for Science and Technology (AAST), Alexandria 1029, Egypt.
Computer Engineering Department, College of Engineering and Technology, Arab Academy for Science and Technology (AAST), Alexandria 1029, Egypt.
Sensors (Basel). 2021 Sep 30;21(19):6555. doi: 10.3390/s21196555.
The co-existence of fifth-generation (5G) and Internet-of-Things (IoT) networks has become inevitable in many applications, since 5G networks provide steadier, more reliable connections, which is extremely important for IoT communication. During transmission, IoT devices (IoTDs) communicate with an IoT gateway (IoTG), whereas in 5G networks, cellular user equipment (CUE) may communicate with any destination (D), whether a base station (BS) or another CUE, the latter being known as device-to-device (D2D) communication. One of the challenges facing 5G and IoT is interference: because the same spectrum is shared, interference may arise at BSs, CUE receivers, and IoTGs. This paper proposes an interference-avoidance distributed deep learning model for IoT and device-to-any-destination communication that learns from data generated by the Lagrange optimization technique to predict the optimum IoTD-D, CUE-IoTG, BS-IoTD, and IoTG-CUE distances for uplink and downlink data communication, thus achieving higher overall system throughput and energy efficiency. The proposed model was compared to state-of-the-art regression benchmarks and showed a substantial improvement in terms of mean absolute error and root mean squared error. Both the analytical and deep learning models reached the optimal throughput and energy efficiency while suppressing interference to any destination and to IoTGs.
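As an illustration of the regression metrics named in the abstract (this is not the authors' code, and the distance values are hypothetical), mean absolute error and root mean squared error between optimizer-derived and model-predicted distances can be computed as follows:

```python
import math

def mae(y_true, y_pred):
    # Mean absolute error: average of |true - predicted|
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    # Root mean squared error: sqrt of the average squared error
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Hypothetical example: optimum IoTD-D distances (metres) produced by the
# Lagrange optimization step vs. distances predicted by a regression model.
optimal = [12.0, 30.0, 45.0, 18.0]
predicted = [11.5, 31.0, 44.0, 18.5]

print(round(mae(optimal, predicted), 3))   # 0.75
print(round(rmse(optimal, predicted), 3))  # 0.791
```

RMSE penalizes large individual errors more heavily than MAE, which is why papers commonly report both when comparing regression benchmarks.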