Three-Layer Stacked Generalization Architecture With Simulated Annealing for Optimum Results in Data Mining

  • Kasthuriarachchi K
  • Liyanage S

Abstract

Combining different machine learning models into a single prediction model usually improves the performance of data analysis. Stacking ensembles are one such approach to building a high-performance classifier that can be applied to various data mining contexts. This study proposes an enhanced stacking ensemble that combines several machine learning algorithms with two layers of meta-classification and uses the Simulated Annealing algorithm to optimize the classifier configuration, addressing limitations of the existing stacking architecture and seeking the best prediction accuracy. The proposed method significantly outperformed three conventional two-layer stacking ensembles built with the same meta-classifiers used in the proposed architecture, and these improvements were statistically significant at the 95% confidence level. The novel stacking ensemble also outperformed existing ensemble methods, namely the AdaBoost algorithm, the gradient boosting algorithm, the XGBoost classifier, and bagging classifiers.
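To make the idea concrete, the sketch below illustrates one way such an architecture could be assembled: a layer of base learners feeds an intermediate meta-classifier, whose output feeds a final meta-classifier, while simulated annealing searches over which learners occupy each layer. This is not the authors' implementation; the learner pools, cooling schedule, and scoring by cross-validated accuracy are assumptions for illustration, written with scikit-learn.

```python
# Minimal sketch (assumed, not the paper's code): three-layer stacking with
# simulated annealing choosing the base and meta classifiers.
import math
import random

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Candidate base learners (layer 1) and meta learners (layers 2 and 3); the
# actual pools used in the paper may differ.
BASE_POOL = {
    "dt": DecisionTreeClassifier,
    "knn": KNeighborsClassifier,
    "nb": GaussianNB,
    "rf": RandomForestClassifier,
}
META_POOL = {
    "lr": LogisticRegression,
    "rf": RandomForestClassifier,
}


def build_stack(config):
    """Three layers: base learners -> first meta-classifier -> final meta-classifier."""
    base = [(f"b{i}_{name}", BASE_POOL[name]()) for i, name in enumerate(config["base"])]
    inner = StackingClassifier(estimators=base,
                               final_estimator=META_POOL[config["meta1"]](),
                               cv=3)
    # The inner stack becomes a single estimator feeding a second meta layer.
    return StackingClassifier(estimators=[("inner", inner)],
                              final_estimator=META_POOL[config["meta2"]](),
                              cv=3)


def neighbour(config):
    """Randomly perturb one element of the current configuration."""
    new = {"base": list(config["base"]), "meta1": config["meta1"], "meta2": config["meta2"]}
    choice = random.choice(["base", "meta1", "meta2"])
    if choice == "base":
        i = random.randrange(len(new["base"]))
        new["base"][i] = random.choice(list(BASE_POOL))
    else:
        new[choice] = random.choice(list(META_POOL))
    return new


def anneal(X, y, steps=30, t0=1.0, cooling=0.9):
    """Simulated annealing over stack configurations, scored by CV accuracy."""
    config = {"base": ["dt", "knn", "nb"], "meta1": "lr", "meta2": "lr"}
    score = cross_val_score(build_stack(config), X, y, cv=3).mean()
    best, best_score, t = config, score, t0
    for _ in range(steps):
        cand = neighbour(config)
        cand_score = cross_val_score(build_stack(cand), X, y, cv=3).mean()
        # Always accept improvements; accept worse moves with Boltzmann probability.
        if cand_score > score or random.random() < math.exp((cand_score - score) / t):
            config, score = cand, cand_score
            if score > best_score:
                best, best_score = config, score
        t *= cooling  # geometric cooling schedule (assumed)
    return best, best_score


if __name__ == "__main__":
    X, y = make_classification(n_samples=400, n_features=12, random_state=0)
    best_config, acc = anneal(X, y, steps=10)
    print(best_config, round(acc, 3))
```

In this sketch the annealing temperature controls how readily a worse configuration is accepted early in the search, which is what lets the optimizer escape locally good but globally suboptimal classifier combinations.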

Citation (APA)
Kasthuriarachchi, K. T. S., & Liyanage, S. R. (2021). Three-Layer Stacked Generalization Architecture With Simulated Annealing for Optimum Results in Data Mining. International Journal of Artificial Intelligence and Machine Learning, 11(2), 1–27. https://doi.org/10.4018/ijaiml.20210701.oa10
