A semantically enhanced text retrieval framework with abstractive summarization


Abstract

Recently, large pretrained language models (PLMs) have led to a revolution in the information retrieval community. In most PLM-based retrieval frameworks, ranking performance depends broadly on the model structure and the semantic complexity of the input text. Sequence-to-sequence generative models have proven competitive for question answering and text generation, so we ask whether such models can improve ranking effectiveness by enhancing input semantics. This article introduces SE-BERT, a semantically enhanced ranking framework based on bidirectional encoder representations from transformers (BERT) that captures more semantic information by modifying the input text. SE-BERT utilizes a pretrained generative language model to produce abstractive summaries for both sides of the candidate passage and concatenates them into a new input sequence, allowing BERT to acquire more semantic information within the constraints of the input sequence's length. Experimental results on two Text Retrieval Conference datasets demonstrate that our approach's effectiveness increases as the length of the input text increases.
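The pipeline described above can be sketched in code. The following is a minimal illustration, not the authors' SE-BERT implementation: the model checkpoints (facebook/bart-large-cnn as the summarizer, bert-base-uncased as the cross-encoder), the single-logit relevance head, and the exact "summary on both sides of the passage" concatenation are assumptions made for this example, and the ranker would need fine-tuning on a passage-ranking dataset before its scores are meaningful.

import torch
from transformers import pipeline, AutoTokenizer, AutoModelForSequenceClassification

# Pretrained generative model used as the abstractive summarizer (assumed checkpoint).
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

# BERT cross-encoder with a single relevance logit (untrained here; in practice
# it would be fine-tuned on a ranking dataset such as MS MARCO).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
ranker = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=1)

def relevance_score(query: str, passage: str) -> float:
    # Generate an abstractive summary of the candidate passage.
    summary = summarizer(passage, max_length=64, min_length=8, do_sample=False)[0]["summary_text"]
    # Concatenate the summary on both sides of the passage (one plausible
    # reading of the paper's description) to form the enhanced input.
    enhanced = f"{summary} {passage} {summary}"
    # Score the (query, enhanced passage) pair with the BERT cross-encoder,
    # truncating to BERT's 512-token input limit.
    inputs = tokenizer(query, enhanced, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        return ranker(**inputs).logits.squeeze().item()

# Hypothetical usage: rank candidate passages for a query by descending score.
query = "what is information retrieval"
candidates = [
    "Information retrieval is the task of finding relevant documents for a user query.",
    "BERT is a transformer encoder pretrained with masked language modeling.",
]
ranked = sorted(candidates, key=lambda p: relevance_score(query, p), reverse=True)

The design point the sketch captures is that the summary supplies a condensed version of the passage's meaning, so even when the concatenated sequence is truncated at 512 tokens, BERT still sees a semantic digest of the full passage.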

Citation (APA)

Pan, M., Li, T., Liu, Y., Pei, Q., Huang, E. A., & Huang, J. X. (2024). A semantically enhanced text retrieval framework with abstractive summarization. Computational Intelligence, 40(1). https://doi.org/10.1111/coin.12603
