On Improving the Efficiency of Majorization-Minorization for the Inference of Rank Aggregation Models

Abstract

The Plackett-Luce model represents discrete choices from a set of items and is often applied to rank aggregation problems. The iterative majorization-minorization (MM) method is among the most relevant approaches for computing the maximum likelihood estimate of the parameters of the Plackett-Luce model, but its convergence can be slow. A noninformative initialization, which assumes all items are equally relevant, is usually adopted at the first step of the iterative inference process. This paper investigates approximate inference methods that provide a better initialization and thereby reduce the number of iterations required for the convergence of majorization-minorization. Two alternatives are considered: a spectral inference method from the literature and a novel approach based on a Poisson probabilistic model. Empirical evaluation is performed on synthetic and real-world datasets. The results reveal that the initialization provided by an approximate method can lead to statistically significant reductions in both the number of iterations and the overall computational time compared to the scheme usually adopted for majorization-minorization.
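
To make the setting concrete, the sketch below implements a standard MM update for the Plackett-Luce model (in the spirit of Hunter's 2004 algorithm) with a pluggable starting point, so a noninformative (uniform) initialization can be compared against an approximate one. This is a minimal illustration under stated assumptions, not the paper's implementation; the function name plackett_luce_mm and its arguments are hypothetical, and NumPy is used for convenience.

import numpy as np

def plackett_luce_mm(rankings, n_items, init=None, tol=1e-8, max_iter=10000):
    # Fit Plackett-Luce worths by an MM scheme (Hunter-style update).
    # rankings: list of sequences of item indices, ordered best to worst.
    # init: optional starting worths; uniform (noninformative) when None.
    # Returns (normalized worths, number of iterations performed).
    gamma = np.full(n_items, 1.0 / n_items) if init is None else np.asarray(init, dtype=float)
    gamma = gamma / gamma.sum()

    # wins[i]: number of stages at which item i is chosen first among the
    # items still remaining, i.e. every non-final position it occupies.
    wins = np.zeros(n_items)
    for r in rankings:
        for i in r[:-1]:
            wins[i] += 1

    for it in range(1, max_iter + 1):
        denom = np.zeros(n_items)
        for r in rankings:
            idx = np.asarray(r)
            # Suffix sums give the total worth of the items remaining at each stage.
            tail = np.cumsum(gamma[idx][::-1])[::-1]
            for t in range(len(r) - 1):
                denom[idx[t:]] += 1.0 / tail[t]
        new_gamma = wins / denom  # MM update; assumes every item is chosen at least once
        new_gamma /= new_gamma.sum()
        if np.max(np.abs(new_gamma - gamma)) < tol:
            return new_gamma, it
        gamma = new_gamma
    return gamma, max_iter

As a usage example, a crude informed start (smoothed counts of how often each item is ranked first) can stand in for the spectral or Poisson-based initializations studied in the paper, and the returned iteration counts can be compared against the uniform start:

rankings = [[0, 1, 2], [0, 2, 1], [1, 0, 2], [0, 1, 2]]
_, iters_uniform = plackett_luce_mm(rankings, 3)
first_counts = np.bincount([r[0] for r in rankings], minlength=3) + 1.0
_, iters_warm = plackett_luce_mm(rankings, 3, init=first_counts)
print(iters_uniform, iters_warm)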

Citation (APA)

Emmendorfer, L. R. (2020). On Improving the Efficiency of Majorization-Minorization for the Inference of Rank Aggregation Models. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12319 LNAI, pp. 151–165). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-61377-8_11
