BERT-QPP: Contextualized Pre-trained transformers for Query Performance Prediction


Abstract

Query Performance Prediction (QPP) is focused on estimating the difficulty of satisfying a user query for a certain retrieval method. While most state-of-the-art QPP methods are based on term frequency and corpus statistics, more recent work in this area has started to explore the utility of pre-trained neural embeddings, neural architectures, and contextual embeddings. Such approaches extract features from pre-trained or contextual embeddings to train a supervised performance predictor. In this paper, we also adopt contextual embeddings for performance prediction, but distinguish ourselves from the state of the art by proposing to directly fine-tune a contextual embedding, i.e., BERT, specifically for the task of query performance prediction. As such, our work allows the fine-tuned contextual representations to estimate the performance of a query based on the association between the representation of the query and the retrieved documents. We compare our approach with the state of the art on the MS MARCO passage retrieval corpus and its three associated query sets: (1) the MS MARCO development set, (2) TREC DL 2019, and (3) TREC DL 2020. We show that our approach not only delivers significantly improved prediction performance compared to all state-of-the-art methods, but also, unlike past neural predictors, exhibits significantly lower latency, making it practical to use.
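
To illustrate the general idea described above, the sketch below fine-tunes BERT as a cross-encoder that maps a (query, top retrieved document) pair to a predicted retrieval-effectiveness score. This is a minimal, hypothetical sketch, not the authors' released implementation: the model name, target metric (e.g., MRR@10 of the query's run), sequence length, and training loop are assumptions for illustration, and it relies on Hugging Face `transformers` and PyTorch.

```python
# Illustrative sketch (assumed setup, not the paper's exact code):
# fine-tune BERT with a single regression head so that the [CLS]
# representation of "query [SEP] top retrieved passage" predicts a
# retrieval-effectiveness target such as the query's MRR@10.
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1  # single-output regression head
)

def encode_pair(query: str, top_doc: str):
    # Cross-encode the query together with its top retrieved passage.
    return tokenizer(query, top_doc, truncation=True,
                     padding="max_length", max_length=256,
                     return_tensors="pt")

# One hypothetical training step: regress the predicted score toward the
# query's observed effectiveness (placeholder query, passage, and target).
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
query = "what is query performance prediction"
top_doc = "Query performance prediction estimates retrieval quality ..."
target = torch.tensor([[0.5]])  # e.g., MRR@10 of the run for this query

model.train()
batch = encode_pair(query, top_doc)
output = model(**batch)
loss = torch.nn.functional.mse_loss(output.logits, target)
loss.backward()
optimizer.step()
```

At inference time, the same cross-encoder would score an unseen query paired with its top retrieved document(s) to produce the performance estimate; exact pairing and pooling choices here are assumptions.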

Citation (APA)

Arabzadeh, N., Khodabakhsh, M., & Bagheri, E. (2021). BERT-QPP: Contextualized Pre-trained transformers for Query Performance Prediction. In International Conference on Information and Knowledge Management, Proceedings (pp. 2857–2861). Association for Computing Machinery. https://doi.org/10.1145/3459637.3482063
