Stimulating Deep Learning Network on Graphical Processing Unit To Predict Water Level

  • Singh, N.
  • Panda, S. P.

Abstract

Deep learning is widespread across fields such as health care, voice recognition, image and video classification, real-time rendering, face recognition, and many other domains. Fundamentally, deep learning is adopted for three reasons: its ability to perform better when trained on very large amounts of data, its high computational speed, and its capacity to learn features at multiple levels of abstraction and representation. Accelerating deep learning requires a high-performance platform, which means accelerated hardware for training complex deep learning problems. Training on large datasets can take hours, days, or weeks, so accelerated hardware that reduces the computational burden is used. The main aim of such studies is to optimize prediction results in terms of accuracy, error rate, and execution time. The Graphical Processing Unit (GPU) is one such accelerator that has come to the fore for reducing training time thanks to its parallel architecture. In this paper, a multi-level (deep) learning approach is simulated on both the Central Processing Unit (CPU) and the GPU. Several studies report that GPUs deliver accurate results at much higher speed. MATLAB is the framework used in this work to train the deep learning network for predicting ground water level from a dataset of three parameters: Temperature, Rainfall, and Water requirement. A thirteen-year dataset of the Faridabad District of Haryana, covering 2006 to 2018, is used to train, validate, test, and analyse the network on the CPU and the GPU. The training function used is trainlm for training the network on the CPU and trainscg for GPU training, since the GPU does not support Jacobian-based training. From our results, it is concluded that for a large dataset the training accuracy increases on the GPU and the training time decreases compared with the CPU. Overall performance improves when the network is trained on the GPU, making it the better method for predicting the water level. The performance evaluation of the network shows the highest regression value, the lowest Mean Square Error (MSE), and the best performance value for the GPU during training.
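To make the described training setup concrete, the following MATLAB sketch (with a hypothetical dataset file, variable names, and hidden-layer size; not the authors' published script) trains a shallow feedforward network with trainlm on the CPU and trainscg on the GPU, then reports MSE and regression values for both:

% X: 3-by-N matrix of inputs [temperature; rainfall; water requirement]
% T: 1-by-N vector of ground water level targets (2006-2018 records)
load('faridabad_groundwater.mat', 'X', 'T');   % assumed dataset file

% CPU training with Levenberg-Marquardt (Jacobian-based), as in the paper
netCPU = fitnet(10, 'trainlm');                % 10 hidden neurons (assumed)
netCPU.divideParam.trainRatio = 0.70;          % train/validation/test split
netCPU.divideParam.valRatio   = 0.15;
netCPU.divideParam.testRatio  = 0.15;
[netCPU, trCPU] = train(netCPU, X, T);

% GPU training with scaled conjugate gradient, since Jacobian-based
% functions such as trainlm are not supported on the GPU
netGPU = fitnet(10, 'trainscg');
[netGPU, trGPU] = train(netGPU, X, T, 'useGPU', 'yes');

% Compare mean squared error and regression (R) values on the full dataset
yCPU = netCPU(X);
yGPU = netGPU(X);
fprintf('CPU  MSE: %.4f  R: %.4f\n', perform(netCPU, T, yCPU), regression(T, yCPU));
fprintf('GPU  MSE: %.4f  R: %.4f\n', perform(netGPU, T, yGPU), regression(T, yGPU));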


CITATION STYLE

APA

Singh, N., & Panda, S. P. (2020). Stimulating Deep Learning Network on Graphical Processing Unit To Predict Water Level. International Journal of Engineering and Advanced Technology, 9(4), 1222–1229. https://doi.org/10.35940/ijeat.d8452.049420
