Adam Optimization Algorithm for Wide and Deep Neural Network

  • Jais, I. K. M.
  • Ismail, A. R.
  • Nisa, S. Q.
Citations: N/A
Readers (Mendeley): 221

Abstract

The objective of this research is to evaluate the effect of the Adam optimization algorithm when applied to a wide and deep neural network. The dataset used was the diagnostic breast cancer dataset from the UCI Machine Learning Repository. The dataset was first fed into a conventional neural network to establish a benchmark, and then into the wide and deep neural network with and without Adam. The results of the wide and deep network improved when Adam was used. In conclusion, Adam is able to improve the performance of a wide and deep neural network.
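The abstract does not spell out the Adam update rule itself. As context, a minimal sketch of the standard Adam step (exponentially decayed first- and second-moment estimates with bias correction), illustrated on a toy one-dimensional quadratic rather than the paper's network; the hyperparameter values are the commonly used defaults, not ones reported in this article:

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter theta at step t (1-indexed)."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment (uncentered variance) estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy example: minimize f(theta) = (theta - 3)^2, whose gradient is 2*(theta - 3).
theta, m, v = 0.0, 0.0, 0.0
for t in range(1, 5001):
    grad = 2.0 * (theta - 3.0)
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.01)
```

After the loop, `theta` has converged close to the minimizer 3.0; the same per-parameter adaptive step is what a framework applies to every weight of a wide and deep network.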

Citation (APA)
Jais, I. K. M., Ismail, A. R., & Nisa, S. Q. (2019). Adam Optimization Algorithm for Wide and Deep Neural Network. Knowledge Engineering and Data Science, 2(1), 41. https://doi.org/10.17977/um018v2i12019p41-46
