Sharma Ekta, Deo Ravinesh C, Davey Christopher P, Carter Brad D
Artificial Intelligence Applications Laboratory, School of Mathematics, Physics and Computing, University of Southern Queensland, Springfield, QLD 4300, Australia.
Centre for Astrophysics, University of Southern Queensland, Springfield, QLD 4300, Australia.
Sensors (Basel). 2024 Aug 14;24(16):5271. doi: 10.3390/s24165271.
Low-Earth-orbit (LEO) satellites are widely acknowledged as a promising infrastructure solution for global Internet of Things (IoT) services. However, the Doppler effect presents a significant challenge for long-range (LoRa) modulation uplink connectivity. This study comprehensively examines the operational efficiency of LEO satellites with respect to the Doppler weather effect, using state-of-the-art artificial intelligence techniques. Two LEO satellite constellations, Globalstar and the International Space Station (ISS), were detected and tracked using ground radars in Perth and Brisbane, Australia, for 24 h starting 1 January 2024. The study involves modelling the constellations, calculating latency and frequency offset, and designing a hybrid Iterative Input Selection-Long Short-Term Memory Network (IIS-LSTM) integrated model to predict the Doppler weather profile for LEO satellites. The IIS algorithm selects relevant input variables for the model, while the LSTM algorithm learns and predicts patterns. This model is compared with Convolutional Neural Network (CNN) and Extreme Gradient Boosting (XGBoost) models. The results show that the packet delivery rate is above 91% for the most sensitive spreading factor, SF12, with a bandwidth of 11.5 MHz for Globalstar and 145.8 MHz for ISS NAUKA. The carrier frequency is 631 MHz for the ISS orbiting at 402.3 km and 500 MHz for Globalstar at 1414 km altitude, aiding in combating packet losses. The IIS-LSTM model achieved an accuracy of 97.51% and a loss of 1.17% with signal-to-noise ratios (SNRs) ranging from 0 to 30 dB. The XGBoost model has the fastest testing time, attaining ≈0.0997 s at higher SNRs with an accuracy of 87%; however, at lower SNRs it proves computationally expensive. IIS-LSTM attains a better computation time at lower SNRs at ≈0.4651 s, followed by XGBoost at ≈0.5990 s and CNN at ≈0.6120 s.
The study calls for further research on LoRa Doppler analysis that considers atmospheric attenuation and relevant space parameters in future work.
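The frequency offsets discussed in the abstract follow from standard orbital mechanics: a satellite in a circular orbit at altitude h moves at v = sqrt(GM / (R + h)), and the worst-case Doppler offset at carrier frequency f_c is bounded by f_c · v / c. The sketch below illustrates this upper bound using the altitudes and carrier frequencies quoted in the abstract; it is a back-of-the-envelope illustration, not the authors' constellation model, and the function names are placeholders.

```python
import math

# Physical constants (SI units)
C = 2.998e8          # speed of light, m/s
GM = 3.986004418e14  # Earth's standard gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m

def orbital_velocity(altitude_m: float) -> float:
    """Circular-orbit speed at a given altitude (vis-viva, circular case)."""
    return math.sqrt(GM / (R_EARTH + altitude_m))

def max_doppler_shift(carrier_hz: float, altitude_m: float) -> float:
    """Upper bound on the Doppler offset, reached when the full orbital
    velocity lies along the line of sight; real pass geometry gives less."""
    return carrier_hz * orbital_velocity(altitude_m) / C

# Figures quoted in the abstract: ISS at 402.3 km on 631 MHz,
# Globalstar at 1414 km on 500 MHz.
for name, f_c, h in [("ISS", 631e6, 402.3e3), ("Globalstar", 500e6, 1414e3)]:
    print(f"{name}: v ≈ {orbital_velocity(h) / 1e3:.2f} km/s, "
          f"max Doppler ≈ {max_doppler_shift(f_c, h) / 1e3:.1f} kHz")
```

The bound works out to roughly 16 kHz for the ISS and 12 kHz for Globalstar, i.e. tens of kHz of offset that an uplink receiver must track, which is why Doppler prediction matters for narrowband LoRa links.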