Stochastic AUC Optimization Algorithms With Linear Convergence


Abstract

Area under the ROC curve (AUC) is a standard metric for measuring classification performance on class-imbalanced data. Developing stochastic learning algorithms that maximize AUC rather than accuracy is therefore of practical interest. However, AUC maximization is challenging because the learning objective is defined over pairs of instances from opposite classes. Existing methods circumvent this issue, but at the cost of high space and time complexity. Building on our previous work, which recast AUC optimization as a convex-concave saddle point problem, we propose a new stochastic batch learning algorithm for AUC maximization. The key difference from our previous work is that we assume the underlying distribution of the data is uniform, and we develop a stochastic primal-dual batch learning algorithm (SPDAM) that achieves a linear convergence rate. We establish the convergence of SPDAM with high probability and demonstrate its effectiveness on standard benchmark datasets.
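To make the saddle-point idea concrete, the sketch below implements a simplified stochastic primal-dual update for the convex-concave AUC objective from the authors' earlier work (closer in spirit to a plain stochastic primal-dual iteration than to SPDAM itself, which is a batch algorithm with linear convergence). The objective, update rules, and all hyperparameters here are illustrative assumptions, not the paper's exact method: the primal variables are the model `w` and the class score means `a`, `b`; the dual variable `alpha` is updated by ascent.

```python
import numpy as np

def spd_auc_sketch(X, y, eta=0.01, epochs=10, seed=0):
    """Simplified stochastic primal-dual iteration for the convex-concave
    saddle-point reformulation of square-loss AUC maximization.
    Illustrative sketch only; not the paper's SPDAM algorithm."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    p = np.mean(y == 1)        # proportion of positive examples
    w = np.zeros(d)            # primal: linear model
    a = b = alpha = 0.0        # primal: class score means; dual: alpha
    for _ in range(epochs):
        for i in rng.permutation(n):
            x, yi = X[i], y[i]
            s = w @ x
            if yi == 1:        # stochastic gradients on a positive example
                gw = 2 * (1 - p) * (s - a) * x - 2 * (1 + alpha) * (1 - p) * x
                ga = -2 * (1 - p) * (s - a)
                gb = 0.0
                galpha = -2 * (1 - p) * s - 2 * p * (1 - p) * alpha
            else:              # stochastic gradients on a negative example
                gw = 2 * p * (s - b) * x + 2 * (1 + alpha) * p * x
                ga = 0.0
                gb = -2 * p * (s - b)
                galpha = 2 * p * s - 2 * p * (1 - p) * alpha
            w -= eta * gw          # primal descent
            a -= eta * ga
            b -= eta * gb
            alpha += eta * galpha  # dual ascent
    return w

def pairwise_auc(scores, y):
    """Empirical AUC: fraction of positive-negative pairs ranked correctly."""
    pos, neg = scores[y == 1], scores[y == -1]
    return np.mean(pos[:, None] > neg[None, :])

# Synthetic imbalance-free demo: two Gaussian clusters.
rng = np.random.default_rng(42)
X = np.vstack([rng.normal(1.0, 0.7, size=(100, 2)),
               rng.normal(-1.0, 0.7, size=(100, 2))])
y = np.array([1] * 100 + [-1] * 100)
w = spd_auc_sketch(X, y)
auc = pairwise_auc(X @ w, y)
```

On this well-separated synthetic data the learned direction `w` yields an empirical AUC close to 1, illustrating that the pairwise AUC objective can be optimized with per-instance (not per-pair) stochastic updates, which is the point of the saddle-point reformulation.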

Citation (APA)
Natole, M., Ying, Y., & Lyu, S. (2019). Stochastic AUC Optimization Algorithms With Linear Convergence. Frontiers in Applied Mathematics and Statistics, 5. https://doi.org/10.3389/fams.2019.00030
