Deep Learning Forecasting and Statistical Modeling for Q/V-Band LEO Satellite Channels
Article published in IEEE Transactions on Machine Learning in Communications and Networking
As the number of satellite networks increases, the radio spectrum is becoming more congested, prompting the need to explore higher frequencies. However, operating at higher frequencies is more difficult due to severe impairments caused by varying atmospheric conditions. Hence, radio channel forecasting is crucial for operators to adjust and maintain the link’s quality. This paper presents a practical approach to Q/V-band modeling for low Earth orbit (LEO) satellite channels based on tools from machine learning and statistical modeling. The developed Q/V-band LEO satellite channel model is composed of: 1) a forecasting method using model-based deep learning, intended for real-time operation of satellite terminals; and 2) a statistical channel simulator that generates a time-series path-loss random process, intended for system design and research. Both approaches capitalize on real measurements obtained from AlphaSat’s Q/V-band transmitter at different geographic latitudes. The results show that model-based deep learning can outperform simple statistical and deep learning methods by at least 50%. Moreover, the model is capable of incorporating varying rain and elevation angle profiles.
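To give a feel for what a statistical path-loss time-series simulator of this kind could look like, here is a minimal sketch that generates a synthetic path-loss trace with a first-order autoregressive (Gauss-Markov) process. This is purely an illustrative assumption, not the simulator developed in the paper; the function name and the parameters (mean path loss, correlation coefficient, noise standard deviation) are hypothetical placeholders.

```python
import numpy as np

def simulate_path_loss(n_samples=3600, mean_db=190.0, rho=0.999,
                       sigma_db=0.5, seed=0):
    """Generate a synthetic path-loss time series (dB) with an AR(1) model.

    Illustrative sketch only: mean_db, rho, and sigma_db are hypothetical
    placeholders, not parameters from the paper's Q/V-band LEO channel model.
    """
    rng = np.random.default_rng(seed)
    excess = np.empty(n_samples)   # excess attenuation around the mean (dB)
    excess[0] = 0.0
    # AR(1): x[t] = rho * x[t-1] + w[t], with w[t] ~ N(0, sigma_db^2)
    for t in range(1, n_samples):
        excess[t] = rho * excess[t - 1] + rng.normal(0.0, sigma_db)
    return mean_db + excess

path_loss_db = simulate_path_loss()
print(path_loss_db[:5])
```

A real simulator for this band would additionally condition the process on rain rate and elevation angle profiles, as the abstract indicates the paper's model does.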
Please follow this link if you would like to read the full article.