Decomposable Atomic Norm Minimization Channel Estimation for Millimeter Wave MIMO-OFDM Systems

Abstract

This paper addresses downlink channel estimation in millimeter wave (mmWave) massive multiple-input multiple-output (MIMO) orthogonal frequency division multiplexing (OFDM) systems with wideband frequency-selective fading channels. Exploiting the sparse scattering nature of the mmWave channel, we cast channel estimation as a three-dimensional (3D) line spectrum estimation problem over the angles of departure and arrival and the time delay. To achieve super-resolution channel estimation, we propose a decomposable 3D atomic norm minimization (ANM) method, which splits the 3D estimation problem into two lower-dimensional subproblems to reduce computational complexity, with the time delays estimated separately in the OFDM (frequency) dimension. Simulation results show that the proposed method achieves mean square error comparable to that of conventional vectorized ANM at much lower computational complexity.
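
The abstract does not spell out the paper's exact formulation, but the per-dimension building block of such a decomposable scheme is standard one-dimensional ANM for line spectrum estimation. Below is a minimal sketch of 1-D ANM denoising in its semidefinite-program form (the Bhaskar-Tang-Recht characterization), solved with cvxpy; the function name anm_denoise, the regularization weight tau, and the toy signal are illustrative assumptions, not the authors' code.

```python
import numpy as np
import cvxpy as cp

def anm_denoise(y, tau):
    """1-D atomic norm denoising (illustrative sketch, not the paper's code).

    Solves  min_x  0.5 * ||y - x||_2^2 + tau * ||x||_A
    using the SDP characterization of the atomic norm ||.||_A:
        ||x||_A = inf { (u_0 + t) / 2 : [[Toep(u), x], [x^H, t]] >= 0 }.
    Returns the denoised signal x_hat and the Toeplitz generator u_hat.
    """
    N = y.size
    # Z = [[Toep(u), x], [x^H, t]] as one Hermitian PSD variable.
    Z = cp.Variable((N + 1, N + 1), hermitian=True)
    u = cp.Variable(N, complex=True)  # first column of Toep(u)
    constraints = [Z >> 0]
    for i in range(N):
        for j in range(i + 1):
            # Tie the top-left block to a Hermitian Toeplitz structure.
            constraints.append(Z[i, j] == u[i - j])
    x = Z[:N, N]            # signal estimate sits in the last column
    t = cp.real(Z[N, N])    # scalar corner entry
    obj = 0.5 * cp.sum_squares(y - x) + tau * (cp.real(u[0]) + t) / 2
    cp.Problem(cp.Minimize(obj), constraints).solve(solver=cp.SCS)
    return x.value, u.value

# Toy usage: two complex sinusoids in noise over N = 16 samples.
rng = np.random.default_rng(0)
N, freqs, amps = 16, np.array([0.12, 0.37]), np.array([1.0, 0.8])
n = np.arange(N)
x_true = (amps * np.exp(2j * np.pi * n[:, None] * freqs)).sum(axis=1)
y = x_true + 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
x_hat, u_hat = anm_denoise(y, tau=1.0)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The frequencies (and hence angles or delays, after the appropriate mapping) can then be recovered from the Vandermonde decomposition of the Toeplitz matrix generated by u_hat. A decomposable method applies this machinery per dimension rather than over the fully vectorized 3D problem, which is where the complexity saving described in the abstract comes from.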

Citation (APA)

An, Q., Jing, T., Wen, Y., Duan, Z., & Huo, Y. (2019). Decomposable Atomic Norm Minimization Channel Estimation for Millimeter Wave MIMO-OFDM Systems. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11604 LNCS, pp. 3–15). Springer Verlag. https://doi.org/10.1007/978-3-030-23597-0_1
