Backpropagation Neural Network with Combination of Activation Functions for Inbound Traffic Prediction

  • Purnawansyah P
  • Haviluddin H
  • Darwis H
  • et al.
Citations: N/A
Readers: 29 (Mendeley users who have this article in their library)

Abstract

Predicting network traffic is crucial for preventing congestion and achieving superior quality of network services. This research uses backpropagation to predict the inbound traffic level in order to understand and determine internet usage. The architecture consists of one input layer, two hidden layers, and one output layer. The study compares three activation functions: sigmoid, rectified linear unit (ReLU), and hyperbolic tangent (tanh). Three learning rates, 0.1, 0.5, and 0.9, represent low, moderate, and high rates, respectively. The results show that, among the single activation functions, sigmoid yields the lowest RMSE and MSE values, but ReLU is superior at learning high-traffic patterns with a learning rate of 0.9. In addition, ReLU is more effective when placed first in a combination of activation functions. Hence, combining a high learning rate with pure ReLU, ReLU-sigmoid, or ReLU-tanh is more suitable and is recommended for predicting peak traffic utilization.
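To make the described setup concrete, here is a minimal sketch of such a network in Keras. It is not the authors' code: the layer widths, the 10-sample input window, and the plain SGD optimizer are assumptions, since the abstract specifies only the layer count, the activation functions, and the learning rates.

```python
# Hypothetical sketch of the architecture described in the abstract:
# one input layer, two hidden layers with a combination of activations
# (ReLU first, then sigmoid), and one output layer.
import tensorflow as tf

WINDOW = 10  # assumed: number of past inbound-traffic samples fed as input

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW,)),
    tf.keras.layers.Dense(16, activation="relu"),     # first hidden layer
    tf.keras.layers.Dense(16, activation="sigmoid"),  # second hidden layer
    tf.keras.layers.Dense(1),                         # predicted traffic level
])

# The paper recommends a high learning rate (0.9) together with ReLU
# in the first hidden layer for learning high-traffic patterns.
model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.9),
              loss="mse")
model.summary()
```

Placing ReLU first matches the paper's finding that ReLU is most effective as the first activation in a combination; swapping the second hidden layer's activation to tanh gives the ReLU-tanh variant the abstract also recommends.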

Cite

APA

Purnawansyah, P., Haviluddin, H., Darwis, H., Azis, H., & Salim, Y. (2021). Backpropagation Neural Network with Combination of Activation Functions for Inbound Traffic Prediction. Knowledge Engineering and Data Science, 4(1), 14–28. https://doi.org/10.17977/um018v4i12021p14-28
