Dense retrieval (DR) has extended the use of pre-trained language models, like BERT, for text ranking. However, recent studies have raised robustness concerns about DR models under query variations, such as queries with typos, which incur non-trivial performance losses. Herein, we argue that it would be beneficial to allow the DR model to learn to align the relative positions of query-passage pairs in the representation space, as query variations cause the query vector to drift away from its original position, hurting the subsequent DR effectiveness. To this end, we propose RoDR, a novel robust DR model that learns to calibrate the in-batch local ranking of a query variation to that of its original query, thereby aligning the DR space. Extensive experiments on the MS MARCO and ANTIQUE datasets show that RoDR significantly improves retrieval results on both the original queries and different types of query variations. Moreover, RoDR provides a general query noise-tolerant learning framework that boosts the robustness and effectiveness of various existing DR models. Our code and models are openly available at https://github.com/cxa-unique/RoDR.
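The local ranking alignment idea can be illustrated with a minimal NumPy sketch: the in-batch passage similarity distribution induced by the original query serves as a target, and the variation's distribution is pulled toward it via a KL-divergence term. This is a hypothetical illustration of the general idea, not the paper's exact training objective; the function names, temperature parameter, and choice of KL direction are assumptions.

```python
import numpy as np

def softmax(scores, tau=1.0):
    """Temperature-scaled softmax over a 1-D score vector."""
    z = scores / tau
    e = np.exp(z - z.max())  # subtract max for numerical stability
    return e / e.sum()

def local_ranking_alignment_loss(q_orig, q_var, passages, tau=1.0):
    """Hypothetical alignment loss: KL divergence between the in-batch
    passage distributions of the original query and its variation.

    q_orig, q_var: query embeddings, shape (d,)
    passages: in-batch passage embeddings, shape (n, d)
    """
    p = softmax(passages @ q_orig, tau)  # target local ranking (original query)
    q = softmax(passages @ q_var, tau)   # local ranking under the variation
    # KL(p || q) penalizes the variation's ranking drifting from the original
    return float(np.sum(p * (np.log(p) - np.log(q))))
```

When the variation's embedding matches the original's, the loss is zero; as the variation drifts in the representation space, its in-batch ranking diverges and the loss grows, which is the drift the abstract describes.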
Chen, X., Luo, J., He, B., Sun, L., & Sun, Y. (2022). Towards Robust Dense Retrieval via Local Ranking Alignment. In IJCAI International Joint Conference on Artificial Intelligence (pp. 1980–1986). International Joint Conferences on Artificial Intelligence. https://doi.org/10.24963/ijcai.2022/275