Feature selection using typical ε-testors, working on dynamical data

Abstract

Typical ε-testors are useful for feature selection in supervised classification problems with mixed incomplete data, where the similarity function is not total coincidence but a threshold function. In this kind of problem, modifications to the training matrix can occur very frequently. Any modification of the training matrix can change the set of all typical ε-testors, so this set must be recomputed after each modification, but the complexity of the algorithms for computing all typical ε-testors of a training matrix is very high. In this paper we analyze how the set of all typical ε-testors changes after a modification, and we present an alternative method to compute all typical ε-testors of the modified training matrix. The complexity of the new method is analyzed and some experimental results are shown. © Springer-Verlag 2004.
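
The notions in the abstract can be illustrated with a small, self-contained sketch. The Python code below is not the incremental method proposed in the paper; it is only a naive brute-force illustration, under simple assumptions, of the underlying ideas: a threshold similarity function over mixed incomplete data, the Boolean comparison rows built from pairs of objects in different classes, and typical ε-testors as irreducible feature subsets that distinguish every such pair. The names (epsilon_similar, comparison_rows, typical_testors) and the toy data are hypothetical choices for this example.

# Naive brute-force sketch (hypothetical, not the paper's algorithm):
# enumerate all feature subsets and keep the irreducible testors.
from itertools import combinations

def epsilon_similar(a, b, eps):
    """Threshold similarity: two values are 'similar' if they differ by at
    most eps; a missing value (None) is treated as similar to anything."""
    if a is None or b is None:
        return True
    return abs(a - b) <= eps

def comparison_rows(class_a, class_b, eps):
    """One Boolean row per pair of objects from different classes;
    entry j is True when feature j distinguishes that pair."""
    return [
        [not epsilon_similar(x[j], y[j], eps) for j in range(len(x))]
        for x in class_a for y in class_b
    ]

def is_testor(rows, features):
    """A feature subset is a testor if, for every pair of objects from
    different classes, at least one feature in the subset distinguishes them."""
    return all(any(row[j] for j in features) for row in rows)

def typical_testors(rows, n_features):
    """Typical (irreducible) testors: testors with no proper subset that is
    also a testor. Exponential search, fine only for tiny examples."""
    testors = [
        set(c)
        for k in range(1, n_features + 1)
        for c in combinations(range(n_features), k)
        if is_testor(rows, c)
    ]
    return [t for t in testors if not any(s < t for s in testors)]

if __name__ == "__main__":
    eps = 0.5
    class_a = [[1.0, 3.0, None], [1.2, 2.9, 5.0]]  # toy objects of class A
    class_b = [[2.5, 3.1, 7.0], [2.6, 4.8, 5.1]]   # toy objects of class B
    rows = comparison_rows(class_a, class_b, eps)
    print(typical_testors(rows, n_features=3))     # prints [{0}] for this toy data

After any modification of the training matrix (adding or removing objects or features), a brute-force approach like this one simply recomputes everything from scratch, which is exactly the cost the paper's alternative method is meant to avoid.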

Cite (APA)

Carrasco-Ochoa, J. A., Ruiz-Shulcloper, J., & De-la-Vega-Doría, L. A. (2004). Feature selection using typical ε-testors, working on dynamical data. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3287, 685–692. https://doi.org/10.1007/978-3-540-30463-0_86
