Generalization analysis for ranking using integral operator

Abstract

The study of the generalization performance of ranking algorithms is one of the fundamental issues in ranking learning theory. Although several generalization bounds have been proposed based on different measures, the convergence rates of the existing bounds are usually at most O(1/√n), where n is the size of the data set. In this paper, we derive novel generalization bounds for regularized ranking in a reproducing kernel Hilbert space via the integral operator of the kernel function. We prove that the rates of our bounds are much faster than O(1/√n). Specifically, we first introduce a notion of local Rademacher complexity for ranking, called local ranking Rademacher complexity, which measures the complexity of the ranking loss function class. We then use the local ranking Rademacher complexity to obtain a basic generalization bound. Finally, we establish the relationship between the local Rademacher complexity and the eigenvalues of the integral operator, and derive sharp generalization bounds with faster convergence rates.
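The eigenvalues of the kernel integral operator, which the abstract ties to the sharper bounds, can be estimated in practice from the spectrum of the normalized Gram matrix K/n. The sketch below illustrates this standard empirical proxy; it is not the paper's method, and the RBF kernel choice and function names are illustrative assumptions.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    # Gaussian (RBF) kernel Gram matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))

def empirical_operator_eigenvalues(X, gamma=1.0):
    # Eigenvalues of K/n, the usual empirical estimate of the
    # eigenvalues of the kernel integral operator under the data distribution.
    n = X.shape[0]
    K = rbf_gram(X, gamma)
    evals = np.linalg.eigvalsh(K / n)
    return np.sort(evals)[::-1]  # descending order
```

Faster-than-O(1/√n) rates of the kind proved in the paper typically hinge on how quickly this eigenvalue sequence decays; a rapidly decaying spectrum yields a smaller local Rademacher complexity and hence a sharper bound.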

Citation (APA)
Liu, Y., Liao, S., Lin, H., Yue, Y., & Wang, W. (2017). Generalization analysis for ranking using integral operator. In 31st AAAI Conference on Artificial Intelligence, AAAI 2017 (pp. 2273–2279). AAAI press. https://doi.org/10.1609/aaai.v31i1.10784
