5G Meets Deep Learning, Ray Tracing, and GPUs
Adriana Flores, NVIDIA | Nima Mohammad Pour Nejatian, NVIDIA | Ahmed Alkhateeb, Arizona State University
GTC 2020
Applying deep learning (DL) in 5G can enable new functionalities and overcome the limitations of existing systems. After a short review of DL-based use cases in the wireless physical layer, we'll present a novel neural network architecture called the auto-precoder: a GPU-accelerated DL model that jointly senses the millimeter-wave (mmWave) MIMO 5G channel and designs the hybrid precoding matrices using only a few training pilots. The model achieves this by leveraging prior observations of the channel. A key challenge in evaluating and deploying DL models in wireless systems is the lack of accurate training datasets. To overcome this, we'll show how GPU-accelerated ray-tracing algorithms (based on REMCOM technology) can be used to generate accurate training data. We'll also demonstrate the accuracy of the auto-precoder across different scenarios and benchmark its performance on CPUs and GPUs.
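For a concrete picture of the core idea, below is a minimal PyTorch sketch of an auto-precoder-style network: a trainable linear "sensing" stage that compresses the channel into a few pilot measurements, followed by a decoder that predicts an analog beam from a fixed codebook. This is an illustrative assumption, not the presenters' actual architecture; the class name, layer sizes, pilot count, and codebook formulation are all hypothetical.

```python
# Hypothetical auto-precoder-style sketch (NOT the presenters' exact model).
# The learned linear layer plays the role of the pilot/sensing vectors;
# the decoder maps the few measurements to logits over candidate beams.
import torch
import torch.nn as nn

NUM_ANTENNAS = 64    # mmWave array size (assumed)
NUM_PILOTS = 8       # "only a few training pilots"
CODEBOOK_SIZE = 128  # number of candidate analog beams (assumed)

class AutoPrecoderSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compresses the channel (real/imag parts stacked into one
        # real vector) into NUM_PILOTS measurements.
        self.sense = nn.Linear(2 * NUM_ANTENNAS, NUM_PILOTS, bias=False)
        # Decoder: predicts a score for each codebook beam from the
        # compressed measurements.
        self.predict = nn.Sequential(
            nn.Linear(NUM_PILOTS, 256),
            nn.ReLU(),
            nn.Linear(256, CODEBOOK_SIZE),
        )

    def forward(self, h):
        # h: (batch, 2 * NUM_ANTENNAS) real-valued channel representation
        measurements = self.sense(h)
        return self.predict(measurements)  # (batch, CODEBOOK_SIZE) beam logits

device = "cuda" if torch.cuda.is_available() else "cpu"
model = AutoPrecoderSketch().to(device)
h = torch.randn(32, 2 * NUM_ANTENNAS, device=device)  # e.g. ray-traced channels
beam_logits = model(h)                                # pick argmax beam per sample
```

Because both stages are differentiable, the sensing vectors and the beam predictor can be trained end to end on channel samples (for example, ones generated by a ray tracer), which mirrors the joint channel-sensing-and-precoding framing in the abstract.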