Abstract
This paper presents a deep learning algorithm for channel estimation in 5G New Radio (NR). Classical neural-network-based channel estimation requires more than one stage to obtain the full channel matrix: the channel is first constructed from the received reference signal and then refined to improve precision. In contrast, to reduce the computational cost, the proposed neural network generates the channel matrix from the information captured on a few subcarriers along the slot. This information is obtained by applying the Least Square (LS) technique only on the Demodulation Reference Signal (DMRS). The received DMRS, placed in the resource grid, can be seen as a 2D low-resolution image, and it is processed to generate the full channel matrix. To reduce the complexity of the hardware implementation, a convolutional neural network (CNN) structure is selected. The solution is analyzed by comparing its Mean Square Error (MSE) and computational cost with those of other deep learning-based channel estimators as well as traditional channel estimation methods. It is demonstrated that the proposed neural network delivers substantial complexity savings with favorable error performance: it reduces the computational cost by an order of magnitude and shows a maximum error discrepancy of 0.018 at 5 dB compared with Minimum Mean Square Error (MMSE) channel estimation.
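As a minimal sketch of the pipeline the abstract describes (not the authors' exact architecture), the snippet below shows the two steps: an LS estimate computed only at the DMRS resource elements, and a small SRCNN-style CNN that treats that low-resolution grid as a two-channel (real/imaginary) image and outputs the full channel matrix for the slot. The grid sizes, DMRS pattern, and layer widths are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def ls_on_dmrs(y_dmrs: torch.Tensor, x_dmrs: torch.Tensor) -> torch.Tensor:
    """Least Square estimate H_LS = Y / X at the DMRS resource elements only."""
    return y_dmrs / x_dmrs  # complex tensor: (batch, n_dmrs_subcarriers, n_dmrs_symbols)


class ChannelEstimationCNN(nn.Module):
    """Upsample the low-resolution LS grid to the full slot size, then refine it
    with a few convolutional layers (assumed sizes: 72 subcarriers x 14 symbols)."""

    def __init__(self, full_size=(72, 14)):
        super().__init__()
        self.full_size = full_size
        self.net = nn.Sequential(
            nn.Conv2d(2, 32, kernel_size=5, padding=2),  # 2 input channels: real and imag
            nn.ReLU(),
            nn.Conv2d(32, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 2, kernel_size=3, padding=1),  # back to real/imag channels
        )

    def forward(self, h_ls: torch.Tensor) -> torch.Tensor:
        # h_ls: complex low-resolution LS estimate, shape (batch, n_dmrs_sc, n_dmrs_sym)
        x = torch.stack((h_ls.real, h_ls.imag), dim=1)                 # (batch, 2, H_lr, W_lr)
        x = F.interpolate(x, size=self.full_size, mode="bilinear",
                          align_corners=False)                        # low-res image -> full grid
        x = self.net(x)                                                # (batch, 2, H, W)
        return torch.complex(x[:, 0], x[:, 1])                        # full channel matrix


# Example: one slot with DMRS on 36 subcarriers of two OFDM symbols (assumed pattern).
y_dmrs = torch.randn(1, 36, 2, dtype=torch.cfloat)  # received DMRS
x_dmrs = torch.randn(1, 36, 2, dtype=torch.cfloat)  # known transmitted DMRS
h_full = ChannelEstimationCNN()(ls_on_dmrs(y_dmrs, x_dmrs))
print(h_full.shape)  # torch.Size([1, 72, 14])
```

In this sketch the interpolation plus a shallow stack of convolutions stands in for the paper's single-stage CNN; in a real setup the network would be trained so that its output minimizes the MSE against the true channel matrix.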
Original language | English |
---|---|
Article number | 4537 |
Journal | Electronics |
Volume | 13 |
Issue number | 22 |
Early online date | 19 Nov 2024 |
DOIs | |
Publication status | Published - Nov 2024 |
Keywords
- channel estimation
- computational cost
- convolutional neural network
- MIMO system
- MSE analysis