Deep Learning for mmWave Beam and Blockage Prediction Using Sub-6GHz Channels
Predicting the millimeter wave (mmWave) beams and blockages using sub-6GHz channels has the potential to enable mobility and reliability in scalable mmWave systems. These gains have attracted increasing interest in the last few years. Prior work, however, has focused on extracting spatial channel characteristics at the sub-6GHz band first and then using them to reduce the mmWave beam training overhead. This approach has a number of limitations: (i) it still requires a beam search at mmWave, (ii) its performance is sensitive to the error associated with extracting the sub-6GHz channel characteristics, and (iii) it does not normally account for the different dielectric properties at the different bands. In this paper, we first prove that under certain conditions, there exist mapping functions that can predict the optimal mmWave beam and the correct blockage status directly from the sub-6GHz channel, thereby overcoming the limitations of prior work. These mapping functions, however, are hard to characterize analytically, which motivates exploiting deep neural network models to learn them. To that end, we prove that a large enough neural network can use the sub-6GHz channel to directly predict the optimal mmWave beam and the correct blockage status with success probabilities that can be made arbitrarily close to one. We then develop an efficient deep learning model and empirically evaluate its beam/blockage prediction performance using the publicly available dataset DeepMIMO. The results show that the proposed solution can predict the mmWave blockages with more than 90% success probability. Further, these results confirm the capability of the proposed deep learning model in predicting the optimal mmWave beams and approaching the optimal data rates, which assume perfect channel knowledge, while requiring no beam training overhead.
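The abstract frames beam prediction as learning a mapping from the sub-6GHz channel to an index in a fixed mmWave beam codebook, i.e., a classification problem. The following is a minimal sketch of that formulation, not the paper's actual architecture: antenna counts, codebook size, and layer widths are illustrative assumptions, and training is omitted (weights are random).

```python
import numpy as np

# Illustrative sizes -- not taken from the paper.
NUM_SUB6_ANT = 4   # sub-6GHz antennas at the base station
NUM_BEAMS = 16     # mmWave beamforming codebook size

rng = np.random.default_rng(0)

def featurize(h):
    """Stack real and imaginary parts of the complex sub-6GHz channel
    vector into a real-valued input suitable for a neural network."""
    return np.concatenate([h.real, h.imag])

# One hidden layer; in practice these weights would be trained on
# (sub-6GHz channel, optimal beam index) pairs, e.g. from DeepMIMO.
W1 = rng.standard_normal((2 * NUM_SUB6_ANT, 32)) * 0.1
b1 = np.zeros(32)
W2 = rng.standard_normal((32, NUM_BEAMS)) * 0.1
b2 = np.zeros(NUM_BEAMS)

def predict_beam(h):
    """Return the predicted codebook beam index and the softmax
    distribution over all beams for one channel realization."""
    x = featurize(h)
    hidden = np.maximum(x @ W1 + b1, 0.0)   # ReLU
    logits = hidden @ W2 + b2
    p = np.exp(logits - logits.max())
    p /= p.sum()                            # softmax over beam indices
    return int(np.argmax(p)), p

# One random complex channel realization.
h = rng.standard_normal(NUM_SUB6_ANT) + 1j * rng.standard_normal(NUM_SUB6_ANT)
beam, probs = predict_beam(h)
```

Blockage prediction fits the same template with a two-class (blocked / line-of-sight) output head instead of the beam codebook.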