Geo-BERT Pre-training Model for Query Rewriting in POI Search

15 citations · 54 Mendeley readers

Abstract

Query Rewriting (QR) addresses the word mismatch between queries and documents in Web search. Existing approaches usually model QR with an end-to-end sequence-to-sequence (seq2seq) model. State-of-the-art Transformer-based models can effectively learn textual semantics from user session logs, but they often ignore users' geographic location, which is crucial for the Point-of-Interest (POI) search of map services. In this paper, we propose a pre-training model, called Geo-BERT, that integrates semantic and geographic information in pre-trained POI representations. First, we model the real-world POI distribution as a graph whose nodes represent POIs and multiple geographic granularities. Then we apply graph representation learning to obtain geographic representations. Finally, we train a BERT-like pre-training model on POI text and the POIs' graph embeddings to obtain a representation that integrates both geographic and semantic information, and apply it to QR for POI search. The proposed model achieves excellent accuracy on a wide range of real-world map-service datasets.
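To make the pipeline in the abstract concrete, the following is a minimal sketch, not the authors' implementation: the abstract only says "graph representation learning," so a DeepWalk-style random-walk embedding is assumed here, and the POI names, grid IDs, and embedding dimensions are hypothetical.

```python
# Sketch of the Geo-BERT input pipeline under the assumptions above.
import random
import networkx as nx
from gensim.models import Word2Vec

# 1. Simulate the POI distribution as a graph: POI nodes are linked to the
#    multi-level geographic grid cells that contain them (hypothetical data).
G = nx.Graph()
pois = {
    "poi:starbucks_5th_ave": ("grid_l1:40.7,-74.0", "grid_l2:40.74,-73.99"),
    "poi:central_park":      ("grid_l1:40.7,-74.0", "grid_l2:40.78,-73.97"),
}
for poi, grids in pois.items():
    for grid in grids:
        G.add_edge(poi, grid)

# 2. Graph representation learning: truncated random walks + skip-gram
#    (DeepWalk-style); node2vec or a GNN would slot in the same way.
def random_walks(graph, num_walks=10, walk_len=8):
    walks = []
    for _ in range(num_walks):
        for node in graph.nodes:
            walk = [node]
            for _ in range(walk_len - 1):
                walk.append(random.choice(list(graph.neighbors(walk[-1]))))
            walks.append(walk)
    return walks

w2v = Word2Vec(sentences=random_walks(G), vector_size=64, window=3,
               min_count=1, sg=1, epochs=5)
geo_emb = {n: w2v.wv[n] for n in G.nodes}  # geographic vector per node

# 3. In the full model, each POI's graph embedding would be combined with its
#    token embeddings before BERT-style pre-training; here we just show the
#    lookup that would feed that input layer.
print(geo_emb["poi:starbucks_5th_ave"][:5])
```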

Citation (APA)

Liu, X., Hu, J., Shen, Q., & Chen, H. (2021). Geo-BERT Pre-training Model for Query Rewriting in POI Search. In Findings of the Association for Computational Linguistics: EMNLP 2021 (pp. 2209–2214). Association for Computational Linguistics. https://doi.org/10.18653/v1/2021.findings-emnlp.190
