Abstract
Feature selection is an indispensable pre-processing technique for selecting the more relevant features and eliminating redundant attributes. Identifying the features most relevant to the target is essential for improving the predictive accuracy of learning algorithms, because irrelevant features in the original feature space cause more classification errors and consume more learning time. Many methods have been proposed for feature relevance analysis, but none has combined Bayes' theorem with self information. This paper therefore introduces a novel integrated approach that weights features using Bayes' theorem and self information, and that selects the highest-weighted attributes as the most relevant features via sequential forward selection. The main objective of the approach is to enhance the predictive accuracy of the naive Bayesian classifier.
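The pipeline the abstract describes — weight each feature, then greedily add high-weighted features while a naive Bayes classifier improves — can be sketched as follows. The weighting formula used here (self information of each feature value combined with a Bayes-rule posterior) is an illustrative assumption; the paper's exact measure is not given in the abstract, and the helper names are hypothetical.

```python
# Illustrative sketch only: the weighting below is an assumed combination of
# self information I(v) = -log2 P(v) and the Bayes-theorem posterior P(c|v),
# not the paper's exact formula.
import math
from collections import Counter, defaultdict

def feature_weight(column, labels):
    """Assumed weight: sum over values v of P(v) * I(v) * max_c P(c|v)."""
    n = len(column)
    weight = 0.0
    for v, cnt in Counter(column).items():
        p_v = cnt / n
        info = -math.log2(p_v) if p_v < 1.0 else 0.0
        # posterior P(c|v) estimated by counting (Bayes theorem on the data)
        post = Counter(l for x, l in zip(column, labels) if x == v)
        weight += p_v * info * (max(post.values()) / cnt)
    return weight

def nb_accuracy(rows, labels, feats):
    """Training accuracy of a categorical naive Bayes restricted to feats."""
    n = len(rows)
    classes = Counter(labels)
    cond = defaultdict(Counter)          # (feature, value) -> class counts
    for row, l in zip(rows, labels):
        for f in feats:
            cond[(f, row[f])][l] += 1
    correct = 0
    for row, l in zip(rows, labels):
        best, best_score = None, float("-inf")
        for c, cc in classes.items():
            score = math.log(cc / n)     # log prior
            for f in feats:              # Laplace-smoothed log likelihoods
                k = len(set(r[f] for r in rows))
                score += math.log((cond[(f, row[f])][c] + 1) / (cc + k))
            if score > best_score:
                best, best_score = c, score
        correct += best == l
    return correct / n

def select_features(rows, labels):
    """Rank features by weight, then sequential forward selection."""
    n_feats = len(rows[0])
    order = sorted(range(n_feats),
                   key=lambda f: feature_weight([r[f] for r in rows], labels),
                   reverse=True)
    selected, best_acc = [], 0.0
    for f in order:                      # add a feature only if it helps
        acc = nb_accuracy(rows, labels, selected + [f])
        if acc > best_acc:
            selected.append(f)
            best_acc = acc
    return selected, best_acc
```

On a toy dataset where feature 0 determines the class and feature 1 is noise, the sketch keeps only feature 0, mirroring the intended effect of discarding irrelevant attributes.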
Citation
Mani, K., & Kalpana, P. (2016). An Efficient Feature Selection based on Bayes Theorem, Self Information and Sequential Forward Selection. International Journal of Information Engineering and Electronic Business, 8(6), 46–54. https://doi.org/10.5815/ijieeb.2016.06.06