A Comparison of Feature Construction Methods in the Context of Supervised Feature Selection for Classification


Abstract

In supervised machine learning applications, feature construction may be used to create additional, informative features with the aim of supporting the prediction of the target output. This study investigates the impact of feature construction, specifically the use of quadratic and interaction terms, on the predictive performance of a classifier. Moreover, the Yager intersection operator is applied as a feature construction method to form additional interaction features. Since feature construction may also create irrelevant features, it can be combined with feature selection to maintain or even reduce the dimensionality of the feature set for model training. In this study, the supervised feature selection method ReliefF is used to rank all features, and the k-nearest neighbor classifier is used for predicting the target classes. On the seven real-world data sets included in this study, the features generated using feature construction are often among the most important features and provide competitive predictive performance.
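To make the construction step concrete, the following is a minimal sketch (not the authors' implementation) of building quadratic terms, pairwise product interactions, and Yager-intersection interactions from a single sample. It assumes feature values have already been scaled to [0, 1], as the Yager t-norm requires, and uses the common Yager t-norm form T(a, b) = max(0, 1 - ((1-a)^p + (1-b)^p)^(1/p)); the function and variable names are illustrative.

```python
import itertools

def yager_and(a, b, p=2.0):
    # Yager t-norm (intersection operator); inputs assumed scaled to [0, 1].
    # For p -> infinity it approaches min(a, b); p controls strictness.
    return max(0.0, 1.0 - ((1.0 - a) ** p + (1.0 - b) ** p) ** (1.0 / p))

def construct_features(row, p=2.0):
    # row: list of feature values scaled to [0, 1].
    # Appends quadratic terms x_i^2, pairwise products x_i * x_j,
    # and Yager-intersection interactions to the original features.
    quad = [x * x for x in row]
    prod = [xi * xj for xi, xj in itertools.combinations(row, 2)]
    yager = [yager_and(xi, xj, p) for xi, xj in itertools.combinations(row, 2)]
    return row + quad + prod + yager

sample = [0.2, 0.8, 0.5]
expanded = construct_features(sample)
# 3 original + 3 quadratic + 3 products + 3 Yager interactions = 12 features
```

In a full pipeline along the lines described above, the expanded feature set would then be ranked with ReliefF and the top-ranked features fed to a k-nearest neighbor classifier.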

Citation (APA)

Nguyen, D. D., Lohrmann, C., & Luukka, P. (2023). A comparison of feature construction methods in the context of supervised feature selection for classification. In Lecture Notes in Networks and Systems (Vol. 567 LNNS, pp. 48–59). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-19694-2_5
