Clipping Noise Compensation with Neural Networks in OFDM Systems


Abstract

The application of deep learning (DL) to physical-layer problems has emerged as a prominent research topic. In this paper, the mitigation of clipping effects in orthogonal frequency division multiplexing (OFDM) systems with the help of a neural network (NN) is investigated. Unlike conventional clipping recovery algorithms, which involve costly iterative procedures, the DL-based method learns to directly reconstruct the clipped part of the signal while the unclipped part is preserved. Furthermore, an interpretation of the learned weight matrices of the neural network is presented: parts of the network, in effect, implement transformations very similar to the Discrete Fourier Transform and its inverse (DFT/IDFT), providing information in both the time and frequency domains. Simulation results show that the proposed method outperforms existing algorithms for recovering clipped OFDM signals in terms of both mean square error (MSE) and bit error rate (BER).
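To make the problem setting concrete, the following is a minimal sketch of the clipping operation the abstract refers to: an OFDM symbol is formed by an IDFT of frequency-domain symbols, its amplitude is clipped at a threshold, and the resulting clipping noise is the distortion a compensation method would need to undo. The subcarrier count, QPSK mapping, and clipping ratio here are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64  # number of subcarriers (assumed; not taken from the paper)

# Random QPSK symbols in the frequency domain
bits = rng.integers(0, 2, size=(2, N))
X = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)

# OFDM modulation: IDFT to the time domain (scaled to unit average power)
x = np.fft.ifft(X) * np.sqrt(N)

# Amplitude clipping at a clipping ratio (CR) relative to the RMS amplitude
cr = 1.2  # assumed clipping ratio
threshold = cr * np.sqrt(np.mean(np.abs(x) ** 2))
mag = np.abs(x)
x_clipped = np.where(mag > threshold, threshold * x / mag, x)

# The clipping noise is the part a compensation scheme must reconstruct;
# samples below the threshold pass through unchanged.
d = x_clipped - x
print("clipped samples:", int(np.sum(mag > threshold)))
print("clipping-noise power:", float(np.mean(np.abs(d) ** 2)))
```

Note that only the samples exceeding the threshold are distorted, which is why the abstract distinguishes between reconstructing the clipped part and protecting the unclipped part.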

Citation (APA)

Sang, T. H., & Xu, Y. C. (2020). Clipping Noise Compensation with Neural Networks in OFDM Systems. Signals, 1(1). https://doi.org/10.3390/signals1010005
