Online experimentation for information retrieval


Abstract

Online experimentation for information retrieval (IR) focuses on insights that can be gained from user interactions with IR systems, such as web search engines. The most common form of online experimentation, A/B testing, is widely used in practice and has helped sustain continuous improvement of the current generation of these systems. As online experimentation takes an increasingly central role in IR research and practice, new techniques are being developed to address questions such as the scale and fidelity of experiments in online settings. This paper gives an overview of the currently available tools. This includes techniques that are already in wide use, such as A/B testing and interleaved comparisons, as well as techniques that have been developed more recently, such as bandit approaches for online learning to rank. The paper summarizes and connects the wide range of techniques and insights that have been developed in this field to date, and concludes with an outlook on open questions and directions for ongoing and future research.
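To make the interleaved-comparison idea mentioned in the abstract concrete, the following is a minimal sketch of one well-known variant, team-draft interleaving. It is an illustrative implementation, not the paper's own code; the function names, the list-based document representation, and the click-credit rule are assumptions made for this example.

```python
import random

def team_draft_interleave(ranking_a, ranking_b, length, rng=random):
    """Merge two rankings into one interleaved list, recording which
    ranker ("team") contributed each slot. Illustrative sketch only."""
    interleaved, team = [], []
    count_a = count_b = 0
    while len(interleaved) < length:
        # The team with fewer picks drafts next; ties are broken at random.
        a_turn = count_a < count_b or (count_a == count_b and rng.random() < 0.5)
        source = ranking_a if a_turn else ranking_b
        # Draft that team's highest-ranked document not yet shown.
        doc = next((d for d in source if d not in interleaved), None)
        if doc is None:
            break  # this ranking is exhausted
        interleaved.append(doc)
        team.append('A' if a_turn else 'B')
        if a_turn:
            count_a += 1
        else:
            count_b += 1
    return interleaved, team

def credit_clicks(team, clicked_positions):
    """Credit each click to the team that drafted the clicked slot;
    the ranker with more credited clicks wins this impression."""
    wins_a = sum(1 for i in clicked_positions if team[i] == 'A')
    wins_b = sum(1 for i in clicked_positions if team[i] == 'B')
    return 'A' if wins_a > wins_b else 'B' if wins_b > wins_a else 'tie'
```

Aggregated over many query impressions, such per-impression outcomes give a paired comparison between two rankers from ordinary click data, which is what makes interleaving considerably more sensitive than A/B testing at the same traffic volume.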

Citation (APA)

Hofmann, K. (2015). Online experimentation for information retrieval. In Communications in Computer and Information Science (Vol. 505, pp. 21–41). Springer Verlag. https://doi.org/10.1007/978-3-319-25485-2_2
