Stability of Random-Projection Based Classifiers. The Bayes Error Perspective


Abstract

In this paper we investigate the Bayes error, and its stability, when the dimension of a classification problem is reduced using random projections. We restrict our attention to the two-class problem and assume that the class-conditional distributions are multivariate normal with the same covariance matrix, i.e., differing only in their means. This is one of the few settings in which the Bayes error can be written as a compact closed-form formula. We determine the bias and the variance of the classification error introduced by random projections. Both full-dimensional normal distributions and singular distributions, whose intrinsic dimension is smaller than the ambient dimension, are considered. These results allow the impact of random dimension reduction to be separated from the impact of the learning sample, and they provide lower bounds on classification errors. The relatively low variance of the Bayes error introduced by random projections confirms the stability of random-projection based classifiers, at least under the stated assumptions.
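The setting described in the abstract can be sketched numerically. For two equally likely Gaussian classes with common covariance Σ and means μ₀, μ₁, the Bayes error has the well-known closed form Φ(−Δ/2), where Δ is the Mahalanobis distance between the means. The snippet below is a minimal illustration, not the paper's method: it uses assumed toy parameters (D=50, k=10, mean shift 0.3, identity covariance, Gaussian projection matrices) and Monte Carlo over random projections to show the bias (mean increase) and spread (std) of the projected Bayes error.

```python
import numpy as np
from math import erf, sqrt

def bayes_error(mu0, mu1, cov):
    # Closed-form Bayes error for two equiprobable Gaussian classes with a
    # common covariance: Phi(-Delta/2), where Delta is the Mahalanobis
    # distance between the class means.
    d = mu1 - mu0
    delta = sqrt(d @ np.linalg.solve(cov, d))
    return 0.5 * (1.0 + erf(-delta / (2.0 * sqrt(2.0))))  # Phi(-delta/2)

rng = np.random.default_rng(0)
D, k = 50, 10                       # ambient and projected dimensions (illustrative)
mu0 = np.zeros(D)
mu1 = np.full(D, 0.3)               # assumed mean shift
cov = np.eye(D)                     # assumed common covariance

full_err = bayes_error(mu0, mu1, cov)

# Monte Carlo over Gaussian random projections R (k x D): the projected
# problem has means R @ mu_i and covariance R @ cov @ R.T.
errs = np.array([
    bayes_error(R @ mu0, R @ mu1, R @ cov @ R.T)
    for R in (rng.standard_normal((k, D)) / sqrt(k) for _ in range(500))
])

print(f"full-dimensional Bayes error: {full_err:.4f}")
print(f"after projection: mean {errs.mean():.4f}, std {errs.std():.4f}")
```

Since any deterministic linear map can only discard information, each projected Bayes error is at least the full-dimensional one; the small standard deviation across projections is what the abstract refers to as stability.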

Citation (APA)

Skubalska-Rafajłowicz, E. (2019). Stability of Random-Projection Based Classifiers. The Bayes Error Perspective. In Springer Proceedings in Mathematics and Statistics (Vol. 294, pp. 121–130). Springer. https://doi.org/10.1007/978-3-030-28665-1_9
