Abstract
The variation of the bit error rate (BER) of orthogonal frequency division multiplexing (OFDM) transmission with the delay time and phase of a multi-path echo has not yet been sufficiently clarified. Multi-path propagation introduces amplitude ripples into the OFDM spectrum, and these ripples change the C/N of each carrier. These changes, in turn, cause the BER to vary from carrier to carrier. The BER of the entire OFDM signal is obtained by averaging the carrier-by-carrier BERs. Using mathematical representations of these factors, we demonstrated theoretically and experimentally that the equivalent C/N degradation from a reference C/N of 22 dB, at which the BER of error-correction-free 64-QAM OFDM is 0.009679, oscillates periodically in delay time with a period of 1/(OFDM bandwidth) and approaches an average value as the delay time increases. We also found that the BER takes either a local maximum or a local minimum at delay times equal to 1/(n·(OFDM carrier interval)), where n is a positive integer.
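The averaging procedure the abstract describes can be sketched numerically. The snippet below is a minimal illustration, not the paper's exact model: it assumes a two-path channel (direct wave plus a single echo of relative amplitude `echo_amp`, delay `delay_s`, and phase `echo_phase`), the standard Gray-coded square 64-QAM BER approximation over AWGN, and ISDB-T Mode-1-like carrier parameters (1405 carriers, ~3.968 kHz spacing) chosen only for concreteness.

```python
import numpy as np
from scipy.special import erfc

def ber_64qam(snr_linear):
    """Approximate BER of Gray-coded square 64-QAM over AWGN at a given linear C/N."""
    M, k = 64, 6  # constellation size, bits per symbol
    q = 0.5 * erfc(np.sqrt(3 * snr_linear / (M - 1)) / np.sqrt(2))  # Q-function
    return (4 / k) * (1 - 1 / np.sqrt(M)) * q

def ofdm_ber_two_path(cn_db=22.0, n_carriers=1405, carrier_spacing=3968.25,
                      delay_s=1e-6, echo_amp=0.3, echo_phase=0.0):
    """Average BER over all carriers of an OFDM signal through a two-path channel.

    The echo imposes an amplitude ripple H(f) on the spectrum, which scales
    the C/N of each carrier; the overall BER is the mean of the per-carrier BERs.
    """
    f = np.arange(n_carriers) * carrier_spacing                      # carrier frequencies [Hz]
    H = 1 + echo_amp * np.exp(-1j * (2 * np.pi * f * delay_s + echo_phase))
    snr0 = 10 ** (cn_db / 10)                                        # reference C/N (linear)
    snr_per_carrier = snr0 * np.abs(H) ** 2                          # rippled per-carrier C/N
    return np.mean(ber_64qam(snr_per_carrier))
```

Sweeping `delay_s` with this kind of model is what exposes the behavior the abstract reports: an oscillation of the averaged BER with period on the order of 1/(OFDM bandwidth), settling toward an average value at long delays.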
Citation
Miki, N. (2001). A study on transmission performance of digital terrestrial television broadcasting and multi-path. Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers, 55(1), 103–111. https://doi.org/10.3169/itej.55.103