Contextual Re-Ranking with Behavior Aware Transformers

19 citations · 17 Mendeley readers

Abstract

In this work, we focus on the contextual document ranking task, which deals with the challenge of modeling user interaction for conversational search. Given a history of user feedback behaviors, such as issuing a query, clicking a document, and skipping a document, we propose to introduce behavior awareness into a neural ranker, resulting in a Hierarchical Behavior Aware Transformers (HBA-Transformers) model. The hierarchy is composed of an intra-behavior attention layer and an inter-behavior attention layer, letting the system effectively distinguish and model different user behaviors. Our extensive experiments on the AOL session dataset demonstrate that the hierarchical behavior-aware architecture is more powerful than a simple combination of history behaviors. In addition, we analyze the conversational properties of queries and show that coherent sessions tend to be more conversational and thus more demanding in terms of accounting for historical user behaviors.
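The two-level attention hierarchy described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: it applies scaled dot-product attention within each behavior's token sequence (intra-behavior), mean-pools each behavior into a summary vector, and then applies attention across the behavior summaries (inter-behavior). The pooling choice, dimensions, and single-head attention are simplifying assumptions for exposition.

```python
import numpy as np

def attention(q, k, v):
    """Single-head scaled dot-product attention with a numerically stable softmax."""
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

rng = np.random.default_rng(0)
d = 8  # toy embedding dimension

# A toy session of three behaviors (e.g. issued query, clicked doc, skipped doc),
# each represented as a sequence of token embeddings of varying length.
behaviors = [rng.standard_normal((n, d)) for n in (4, 3, 5)]

# Intra-behavior attention: contextualize tokens within each behavior,
# then mean-pool to get one summary vector per behavior.
summaries = np.stack([attention(x, x, x).mean(axis=0) for x in behaviors])

# Inter-behavior attention: let behavior summaries attend to one another,
# yielding a session-level representation a ranker could consume.
session_repr = attention(summaries, summaries, summaries)
print(session_repr.shape)  # one d-dimensional vector per behavior
```

A full model would replace the mean-pooling with learned aggregation (e.g. a [CLS]-style token) and use multi-head transformer layers at both levels, but the separation of within-behavior and across-behavior attention is the structural idea.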

Citation (APA)

Qu, C., Xiong, C., Zhang, Y., Rosset, C., Croft, W. B., & Bennett, P. (2020). Contextual Re-Ranking with Behavior Aware Transformers. In SIGIR 2020 - Proceedings of the 43rd International ACM SIGIR Conference on Research and Development in Information Retrieval (pp. 1589–1592). Association for Computing Machinery, Inc. https://doi.org/10.1145/3397271.3401276
