Improving the accuracy of least-squares probabilistic classifiers


Abstract

The least-squares probabilistic classifier (LSPC) is a computationally efficient alternative to kernel logistic regression. However, to ensure that its learned class-posterior probabilities are non-negative, LSPC involves a post-processing step that rounds negative parameters up to zero, which can unexpectedly degrade classification performance. To mitigate this problem, we propose a simple alternative scheme that directly rounds up the classifier's negative outputs rather than its negative parameters. Through extensive experiments, including real-world image classification and audio tagging tasks, we demonstrate that the proposed modification significantly improves classification accuracy while preserving the computational advantage of the original LSPC. Copyright © 2011 The Institute of Electronics, Information and Communication Engineers.
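The contrast between the two post-processing schemes can be illustrated with a minimal sketch. The code below is an illustrative reconstruction, not the authors' implementation: it assumes a standard LSPC setup in which one parameter vector per class is fit to the class-indicator targets by kernel regularized least squares, then compares rounding negative parameters (original LSPC) against rounding negative outputs (the proposed scheme). The function names, the Gaussian kernel width, and the regularization constant are all illustrative choices.

```python
import numpy as np

def gauss_kernel(X, C, sigma=1.0):
    """Gaussian kernel matrix between samples X and centers C."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lspc_fit(X, y, n_classes, lam=0.1, sigma=1.0):
    """Fit one parameter vector per class by regularized least squares:
    Theta = argmin ||K Theta - Y||^2 + lam ||Theta||^2,
    where Y holds the one-hot class indicators."""
    K = gauss_kernel(X, X, sigma)
    Y = np.eye(n_classes)[y]                      # one-hot targets
    A = K.T @ K + lam * np.eye(len(X))
    return np.linalg.solve(A, K.T @ Y)            # (n_samples, n_classes)

def predict_round_params(Theta, K):
    """Original LSPC post-processing: round negative PARAMETERS up to zero,
    then normalize the resulting outputs into probabilities."""
    q = K @ np.maximum(Theta, 0.0)
    return q / (q.sum(axis=1, keepdims=True) + 1e-12)

def predict_round_outputs(Theta, K):
    """Proposed scheme: keep the learned parameters as-is and round
    negative OUTPUTS up to zero before normalizing."""
    q = np.maximum(K @ Theta, 0.0)
    return q / (q.sum(axis=1, keepdims=True) + 1e-12)
```

Both predictors return row-normalized, non-negative class-posterior estimates; the only difference is where the clipping is applied, so the proposed variant adds no training-time cost over the original.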

Citation (APA)
Yamada, M., Sugiyama, M., Wichern, G., & Simm, J. (2011). Improving the accuracy of least-squares probabilistic classifiers. IEICE Transactions on Information and Systems, E94-D(6), 1337–1340. https://doi.org/10.1587/transinf.E94.D.1337
