High-resolution temperature forecasting is often challenging for conventional machine learning models because temperature is highly seasonal, varying with the time of year as well as with the hour of the day. In most cases, temperature forecasting methods provide only daily extremes or mean temperatures. However, with the growing availability of data and the development of deep neural networks (DNNs) capable of detecting complex relationships, high-resolution temperature forecasting is becoming more tractable. Typically, historical temperature data along with data from multiple meteorological sensors are used for temperature forecasting, which increases the complexity of the system and makes it harder and costlier to implement physically. In this paper, high-resolution hourly temperature forecasting is performed using only historical temperature data. The paper presents a comparative analysis among four popular DNNs (the simple recurrent neural network (SRN), gated recurrent unit (GRU), long short-term memory (LSTM), and convolutional neural network (CNN)) and two hybrid models (a CNN-LSTM parallel network and a GRU-LSTM parallel network), all trained on the Beijing temperature dataset. Experimental results show that the GRU-LSTM parallel network obtains the lowest RMSE (1.691 °C), whereas the CNN has the best computational efficiency at the cost of a slightly worse RMSE (1.759 °C). Additionally, a robustness analysis performed on temperature data from four additional, geographically diverse locations (Toronto, Las Vegas, Seattle, and Dallas) reveals the GRU to be the most consistent algorithm. Finally, the paper establishes a correlation between model performance and the variance and mean absolute deviation of each dataset relative to the training dataset.
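The comparison above rests on framing the hourly series as supervised input windows and scoring forecasts by RMSE, and the robustness analysis relates performance to the variance and mean absolute deviation of each location's data. A minimal sketch of those building blocks follows; the synthetic sinusoidal series and the naive persistence baseline are illustrative assumptions, not the paper's networks or the Beijing dataset.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error, the metric the paper reports (e.g. 1.691 °C)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mean_absolute_deviation(x):
    """Mean absolute deviation about the mean, used alongside variance to
    characterise how a target dataset differs from the training dataset."""
    x = np.asarray(x, float)
    return float(np.mean(np.abs(x - x.mean())))

def sliding_windows(series, lookback):
    """Turn an hourly series into (lookback-hour window, next-hour target)
    pairs -- the supervised framing a recurrent forecaster trains on."""
    series = np.asarray(series, float)
    X = np.stack([series[i:i + lookback] for i in range(len(series) - lookback)])
    y = series[lookback:]
    return X, y

# Hypothetical hourly temperatures: a synthetic 24-hour cycle, two days long.
hours = np.arange(48)
temps = 10 + 8 * np.sin(2 * np.pi * hours / 24)

X, y = sliding_windows(temps, lookback=24)
persistence = X[:, -1]  # naive baseline: next hour equals the last observed hour
print(round(rmse(y, persistence), 3))
print(round(mean_absolute_deviation(temps), 3))
```

Any of the paper's networks would replace the persistence baseline here: the model consumes each 24-hour window in `X` and predicts the corresponding entry of `y`, and its RMSE is compared against datasets whose variance and mean absolute deviation differ from the training set's.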
Haque, E., Tabassum, S., & Hossain, E. (2021). A Comparative Analysis of Deep Neural Networks for Hourly Temperature Forecasting. IEEE Access, 9, 160646–160660. https://doi.org/10.1109/ACCESS.2021.3131533