Some theory for Fisher's linear discriminant function, ‘naive Bayes’, and some alternatives when there are many more variables than observations

  • Peter J. Bickel
  • Elizaveta Levina

In the classical problem of discriminating between two normal populations, we show that the ‘naive Bayes’ classifier, which assumes independent covariates, greatly outperforms the Fisher linear discriminant rule under broad conditions when the number of variables grows faster than the number of observations. We also introduce a class of rules spanning the range between independence and arbitrary dependence. These rules are shown to achieve Bayes consistency for the Gaussian ‘coloured noise’ model and to adapt to a spectrum of convergence rates, which we conjecture to be minimax.
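The contrast described in the abstract can be illustrated with a small simulation. The sketch below is not the paper's construction: it simply fits both rules on simulated Gaussian data with more variables than observations, using the pooled sample covariance (pseudo-inverted, since it is singular when p > n) for Fisher's rule and only the diagonal variances for the independence ('naive Bayes') rule. The dimensions, mean shift, and identity covariance are illustrative choices, not the paper's assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

p, n = 200, 50          # many more variables (p) than observations per class (n)
delta = 0.2             # per-coordinate mean shift between the two populations

# Two Gaussian populations with identity covariance (an illustrative special
# case; the paper's theory covers far more general dependence structures).
mu0, mu1 = np.zeros(p), delta * np.ones(p)
X0 = rng.normal(mu0, 1.0, size=(n, p))
X1 = rng.normal(mu1, 1.0, size=(n, p))

def fit_rule(X0, X1, naive):
    """Return (w, b) for the linear rule: classify x as population 1 if x @ w > b."""
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    Xc = np.vstack([X0 - m0, X1 - m1])          # pooled, centred data
    if naive:
        # 'naive Bayes': ignore off-diagonal covariances entirely
        inv = np.diag(1.0 / Xc.var(axis=0, ddof=2))
    else:
        # Fisher's rule: pooled sample covariance, pseudo-inverted because
        # a p x p covariance estimated from 2n < p points is singular
        inv = np.linalg.pinv(Xc.T @ Xc / (2 * n - 2))
    w = inv @ (m1 - m0)
    b = w @ (m0 + m1) / 2
    return w, b

def accuracy(w, b, trials=2000):
    """Estimate test accuracy on fresh draws from both populations."""
    T0 = rng.normal(mu0, 1.0, size=(trials, p))
    T1 = rng.normal(mu1, 1.0, size=(trials, p))
    return 0.5 * ((T0 @ w <= b).mean() + (T1 @ w > b).mean())

nb_acc = accuracy(*fit_rule(X0, X1, naive=True))
lda_acc = accuracy(*fit_rule(X0, X1, naive=False))
print(f"naive Bayes accuracy: {nb_acc:.3f}, Fisher LDA accuracy: {lda_acc:.3f}")
```

With this setup the independence rule typically sits close to the Bayes accuracy, while the pseudo-inverse Fisher rule degrades because it must estimate p(p+1)/2 covariance parameters from far fewer observations, which is the phenomenon the paper analyses.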
