The comparison of performance according to initialization methods of deep neural network for malware dataset

ISSN: 2278-3075

Abstract

When training deep neural networks, proper initialization of the weights leads to good performance. The methods commonly used to initialize weights either limit the variance of the weight values or re-use unsupervised pre-trained weights. In this paper, we propose a new algorithm in which some of the weights are pre-trained by the contrastive divergence (CD) method of an unsupervised deep belief network (DBN), while the remaining weights are initialized with the Xavier or He initialization method. We call these methods DBNnX and DBNnHe. We compare the performance of several DBNnX and DBNnHe variants against existing methods, evaluating and visualizing the results with AUC scores and box plots. The experiments show that DBN2X and DBN2He perform best.
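The hybrid scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the layer-size list, and the `pretrained` argument (standing in for weights produced by CD pre-training of a DBN) are all assumptions for the sake of the example.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng):
    # Xavier (Glorot) initialization: variance 2 / (fan_in + fan_out)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def he_init(fan_in, fan_out, rng):
    # He initialization: variance 2 / fan_in (suited to ReLU units)
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_in, fan_out))

def init_weights(layer_sizes, n_pretrained, pretrained, method="xavier", seed=0):
    """Use pre-trained weights for the first n_pretrained layers and
    Xavier or He initialization for the remaining layers.

    `pretrained` is a hypothetical list of weight matrices assumed to come
    from CD pre-training of a DBN, one per pre-trained layer."""
    rng = np.random.default_rng(seed)
    init_fn = xavier_init if method == "xavier" else he_init
    weights = []
    for i, (fi, fo) in enumerate(zip(layer_sizes[:-1], layer_sizes[1:])):
        if i < n_pretrained:
            weights.append(pretrained[i])   # re-used unsupervised pre-trained weights
        else:
            weights.append(init_fn(fi, fo, rng))
    return weights
```

In this sketch, `n_pretrained` plays the role of the "n" in DBNnX/DBNnHe: e.g. `init_weights(sizes, 2, pre, method="xavier")` would correspond to a DBN2X-style setup, with the first two layers taken from the pre-trained DBN and the rest drawn from the Xavier distribution.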

Citation (APA)

Kwon, Y. M., Kwon, Y. W., Chung, D. K., & Lim, M. J. (2019). The comparison of performance according to initialization methods of deep neural network for malware dataset. International Journal of Innovative Technology and Exploring Engineering, 8(4S2), 57–62.
