Context-Aware Transformer Pre-Training for Answer Sentence Selection


Abstract

Answer Sentence Selection (AS2) is a core component for building an accurate Question Answering pipeline. AS2 models rank a set of candidate sentences by how likely they are to answer a given question. The state of the art in AS2 exploits pre-trained transformers by transferring them to large annotated datasets, while using local contextual information around each candidate sentence. In this paper, we propose three pre-training objectives designed to mimic the downstream fine-tuning task of contextual AS2, allowing language models to be specialized for contextual AS2 before fine-tuning. Our experiments on three public and two large-scale industrial datasets show that our pre-training approaches (applied to RoBERTa and ELECTRA) can improve baseline contextual AS2 accuracy by up to 8% on some datasets.
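To make the task concrete, below is a minimal sketch of contextual AS2 inference with a transformer cross-encoder. It is illustrative only: roberta-base stands in for the paper's specialized checkpoints, rank_candidates is a hypothetical helper, and concatenating the candidate with its surrounding sentences is an assumption about the input format, not the paper's exact specification.

```python
# Minimal sketch of contextual AS2 ranking (assumptions noted above).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Stand-in checkpoint; the paper's specialized pre-trained models are not
# referenced here by name.
tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2
)
model.eval()

def rank_candidates(question, candidates, contexts):
    """Score each (question, candidate + local context) pair and sort.

    `contexts[i]` is the local context around `candidates[i]`, e.g. the
    previous and next sentences in the source document (an assumed format).
    """
    scores = []
    for sentence, context in zip(candidates, contexts):
        # Pair the question with the candidate plus its surrounding context.
        inputs = tokenizer(
            question, f"{sentence} {context}",
            truncation=True, return_tensors="pt",
        )
        with torch.no_grad():
            logits = model(**inputs).logits
        # Probability that the candidate answers the question (label 1).
        scores.append(logits.softmax(dim=-1)[0, 1].item())
    order = sorted(range(len(candidates)), key=lambda i: -scores[i])
    return [(candidates[i], scores[i]) for i in order]

if __name__ == "__main__":
    ranked = rank_candidates(
        "When was the ACL founded?",
        ["The ACL was founded in 1962.", "The conference is held annually."],
        ["It is a professional society.", "Papers are peer reviewed."],
    )
    for sentence, score in ranked:
        print(f"{score:.3f}  {sentence}")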

Citation (APA)
Di Liello, L., Garg, S., & Moschitti, A. (2023). Context-Aware Transformer Pre-Training for Answer Sentence Selection. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2, pp. 458–468). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-short.40
