Gender Classification from Face Images Using Mutual Information and Feature Fusion

Abstract

In this article we report a new method for gender classification from frontal face images using feature selection based on mutual information and fusion of features extracted from intensity, shape, and texture, and from three different spatial scales. We compare three mutual information measures for feature selection: minimal-redundancy-maximal-relevance (mRMR), normalized mutual information feature selection (NMIFS), and conditional mutual information feature selection (CMIFS). We also show that fusing the features extracted by six different methods significantly improves gender classification relative to previously published results, yielding a classification rate of 99.13% on the FERET database. © 2012 Copyright Taylor and Francis Group, LLC.
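To illustrate the kind of selection criterion the abstract refers to, below is a minimal Python sketch of greedy mRMR-style feature selection over discrete features. It is an illustration of the general technique, not the authors' implementation: the empirical mutual information estimator, the discrete (pre-quantized) features, and the function names are all assumptions for this example.

```python
from collections import Counter
from math import log2

def mutual_information(x, y):
    """Empirical I(X;Y) in bits for two discrete sequences of equal length."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    # I(X;Y) = sum_{a,b} p(a,b) * log2( p(a,b) / (p(a) p(b)) )
    return sum((c / n) * log2(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())

def mrmr_select(features, labels, k):
    """Greedily pick k feature indices maximizing relevance I(f; labels)
    minus mean redundancy I(f; g) over already-selected features g (mRMR)."""
    selected, remaining = [], list(range(len(features)))
    while len(selected) < k and remaining:
        best, best_score = None, float("-inf")
        for i in remaining:
            relevance = mutual_information(features[i], labels)
            redundancy = (sum(mutual_information(features[i], features[j])
                              for j in selected) / len(selected)) if selected else 0.0
            score = relevance - redundancy
            if score > best_score:
                best, best_score = i, score
        selected.append(best)
        remaining.remove(best)
    return selected
```

In practice, continuous face features (intensity, shape, texture) would first be quantized into discrete bins before an estimator like this is applied; NMIFS and CMIFS replace the redundancy term with normalized and conditional variants, respectively.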

Citation (APA)

Perez, C., Tapia, J., Estévez, P., & Held, C. (2012). Gender Classification from Face Images Using Mutual Information and Feature Fusion. International Journal of Optomechatronics, 6(1), 92–119. https://doi.org/10.1080/15599612.2012.663463
