Transformer Models for Recommending Related Questions in Web Search

Abstract

People Also Ask (PAA) is an exciting feature in most leading search engines that recommends related questions for a given user query, thereby attempting to bridge the gap between the user's information need and the issued query. This helps users dive deeper into the topic of interest and reduces task completion time. However, showing unrelated or irrelevant questions is highly detrimental to the user experience. While there has been significant work on query reformulation and related searches, there is hardly any published work on recommending related questions for a query. Question suggestion is challenging because the recommended question must be interesting, structurally correct, not a duplicate of information already visible on the page, and reasonably related to the original query. In this paper, we present our system, which uses a Transformer-based neural representation, BERT (Bidirectional Encoder Representations from Transformers), to encode the query, the candidate question, and the corresponding search result snippets. Our best model achieves an accuracy of ∼81%.
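
To make the setup concrete, below is a minimal sketch of one plausible way to score query–question relatedness with a BERT cross-encoder using the Hugging Face Transformers library. The model name, the pairwise input packing, the binary classification head, and the `relatedness_score` helper are illustrative assumptions, not the authors' exact architecture, which also incorporates search result snippets and is trained on labeled data.

```python
# Illustrative sketch only: a BERT cross-encoder that scores whether a candidate
# "People Also Ask" question is related to a user query. The classification head
# below is randomly initialized and would need fine-tuning on labeled
# related/unrelated (query, question) pairs before the scores are meaningful.

import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.eval()

def relatedness_score(query: str, question: str) -> float:
    """Return an estimated probability that `question` is related to `query`."""
    # Pack the pair as [CLS] query [SEP] question [SEP], as in standard
    # BERT sentence-pair classification.
    inputs = tokenizer(query, question, return_tensors="pt",
                       truncation=True, max_length=128)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Example usage with a hypothetical query and candidate PAA question.
print(relatedness_score("symptoms of flu", "How long does the flu last?"))
```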

Citation

Mitra, R., Gupta, M., & Dandapat, S. (2020). Transformer Models for Recommending Related Questions in Web Search. In International Conference on Information and Knowledge Management, Proceedings (pp. 2153–2156). Association for Computing Machinery. https://doi.org/10.1145/3340531.3412067
