A comparative analysis of various regularization techniques to solve overfitting problem in artificial neural network

Abstract

Neural networks with a large number of parameters are considered very effective machine learning tools. However, as the number of parameters grows, the network becomes slow to use and the problem of overfitting arises. Various ways to prevent a model from overfitting are discussed here, and a comparative study of these methods has been carried out. The effects of the various regularization methods on the performance of neural network models are observed.
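
Commonly used regularization techniques for neural networks include, for example, L2 weight decay and dropout. The following is a minimal illustrative sketch (not taken from the paper) showing how two such techniques are typically applied in PyTorch; the layer sizes, dropout rate, and weight-decay value are arbitrary assumptions for illustration.

# Illustrative sketch: L2 weight decay and dropout as regularizers
# for a small feed-forward classifier (hypothetical sizes).
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # dropout: randomly zeroes activations during training
    nn.Linear(256, 10),
)

# L2 regularization is applied through the optimizer's weight_decay term,
# which penalizes large weights and so discourages overfitting.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)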

Citation (APA)

Gupta, S., Gupta, R., Ojha, M., & Singh, K. P. (2018). A comparative analysis of various regularization techniques to solve overfitting problem in artificial neural network. In Communications in Computer and Information Science (Vol. 799, pp. 363–371). Springer Verlag. https://doi.org/10.1007/978-981-10-8527-7_30
