A Study on the Efficiency and Generalization of Light Hybrid Retrievers


Abstract

Hybrid retrievers can take advantage of both sparse and dense retrievers. Previous hybrid retrievers leverage indexing-heavy dense retrievers. In this work, we study the question: "Is it possible to reduce the indexing memory of hybrid retrievers without sacrificing performance?" Driven by this question, we leverage an indexing-efficient dense retriever (i.e., DrBoost) and introduce a LITE retriever that further reduces the memory of DrBoost. LITE is jointly trained with contrastive learning and knowledge distillation from DrBoost. We then integrate BM25, a sparse retriever, with either LITE or DrBoost to form light hybrid retrievers. Our Hybrid-LITE retriever uses 13× less memory while maintaining 98.0% of the performance of the hybrid retriever combining BM25 and DPR. In addition, we study the generalization capacity of our light hybrid retrievers on an out-of-domain dataset and a set of adversarial attack datasets. Experiments show that light hybrid retrievers achieve better generalization performance than individual sparse and dense retrievers. Nevertheless, our analysis shows that there is large room to improve the robustness of retrievers, suggesting a new research direction.
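To illustrate the core idea of a hybrid retriever, the sketch below combines a sparse (BM25-style) score with a dense (inner-product) score by weighted interpolation after min-max normalization. The weight `alpha`, the normalization scheme, and the toy scores are illustrative assumptions, not the paper's exact fusion method.

```python
# Hypothetical sketch of hybrid retrieval via score interpolation.
# Assumption: scores from each retriever are min-max normalized so the
# sparse and dense score ranges become comparable before mixing.

def min_max_normalize(scores):
    """Rescale a {doc_id: score} dict to the [0, 1] range."""
    lo, hi = min(scores.values()), max(scores.values())
    if hi == lo:
        return {doc: 0.0 for doc in scores}
    return {doc: (s - lo) / (hi - lo) for doc, s in scores.items()}

def hybrid_rank(sparse_scores, dense_scores, alpha=0.5):
    """Rank documents by alpha * sparse + (1 - alpha) * dense score."""
    sparse = min_max_normalize(sparse_scores)
    dense = min_max_normalize(dense_scores)
    docs = set(sparse) | set(dense)  # union: a doc may appear in only one list
    combined = {
        d: alpha * sparse.get(d, 0.0) + (1 - alpha) * dense.get(d, 0.0)
        for d in docs
    }
    return sorted(combined, key=combined.get, reverse=True)

# Toy example: doc "b" is strong under both retrievers, so the hybrid
# ranks it first even though neither retriever ranks it first alone.
sparse = {"a": 10.0, "b": 9.0, "c": 1.0}
dense = {"a": 0.1, "b": 0.9, "c": 1.0}
print(hybrid_rank(sparse, dense, alpha=0.5))
```

The interpolation weight `alpha` is typically tuned on a development set; the paper's contribution is orthogonal to the fusion rule, since it shrinks the dense side's index (via DrBoost and LITE) while keeping the sparse side (BM25) fixed.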

Citation (APA)

Luo, M., Jain, S., Gupta, A., Einolghozati, A., Oguz, B., Chatterjee, D., … Heidari, P. (2023). A Study on the Efficiency and Generalization of Light Hybrid Retrievers. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 1617–1626). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-short.139
